Why Global Semiconductor Firms are Reluctant to Invest in AI, IoT Amid Huge Growth Potential

Published December 20, 2023
Leveraging AI in semiconductor design and manufacturing

IoT and AI are likely to escalate R&D expenses, require a highly skilled workforce, and involve cross-border regulation.

Semiconductors are now the backbone of modern electronics and play an imperative role in our daily lives. They are the building blocks of integrated circuits, which power everything from computers and smartphones to essential home appliances. Technology has evolved tremendously over the past few decades, and artificial intelligence (AI) and the Internet of Things (IoT) now play a pivotal role in almost every industrial sector. Semiconductor manufacturing involves designing devices and using photolithography to place circuits accurately on silicon wafers.

The chip industry is growing at a rapid pace all over the world and includes firms that design and produce chip devices and components, including integrated circuits and transistors. The industry has become highly competitive, with major companies investing billions in research and development. AI now plays an important role in the semiconductor industry, transforming the sector by improving the design, testing, and manufacturing of chips.

In design, AI helps engineers optimize semiconductor architecture and performance by quickly examining colossal datasets and identifying patterns that would be difficult for engineers to detect on their own. AI is also shaping the industry's design practices and improving the capabilities of chip-enabled devices. On the verification side, much of the procedure is now automated with the help of AI, which helps ensure the reliability of complex semiconductor designs. The global AI market is forecast to grow to $390.9 billion by 2025, representing a compound annual growth rate of 55.6 percent over that short period.
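To make the idea concrete, here is a minimal, hypothetical sketch of how a machine-learning model might mine a design dataset for patterns that affect a performance metric. The design parameters, the synthetic data, and the chosen model are illustrative assumptions, not any vendor's actual design flow.

```python
# A minimal sketch: a model learns how design parameters relate to a
# performance metric and surfaces patterns that are hard to spot by hand.
# The dataset is synthetic and all parameter names are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 2000

# Hypothetical design-space samples
gate_count = rng.uniform(1e6, 5e7, n)
wire_length_mm = rng.uniform(10, 500, n)
vdd = rng.uniform(0.6, 1.1, n)          # supply voltage
clock_ghz = rng.uniform(0.5, 3.5, n)

# Synthetic "measured" power (W): an assumed relationship, plus noise
power_w = (1e-8 * gate_count * vdd**2 * clock_ghz
           + 0.002 * wire_length_mm
           + rng.normal(0, 0.5, n))

X = np.column_stack([gate_count, wire_length_mm, vdd, clock_ghz])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, power_w)

# Feature importances hint at which design knobs dominate the metric
for name, imp in zip(["gate_count", "wire_length_mm", "vdd", "clock_ghz"],
                     model.feature_importances_):
    print(f"{name:>15}: {imp:.2f}")

# Screen a candidate design point before committing to detailed simulation
candidate = np.array([[2e7, 120.0, 0.8, 2.0]])
print("predicted power (W):", model.predict(candidate)[0])
```

In practice such a model would be trained on simulation or silicon data rather than synthetic samples, but the pattern-mining step it illustrates is the same.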


Akshara Bassi, Senior Research Analyst at Counterpoint, told Circuit Digest exclusively, “AI helps in accelerating chip design and development times. At the same time, ML and AI help analyze large datasets to generate efficient chip architectures. Additionally, they help in simulating the chip performance before it goes to production in the foundry. The simulation can help in tweaks for improved performance and enhanced functionality. Now, if you speak about the role of AI in semiconductor manufacturing or at the foundry level, AI helps activate predictive maintenance by analyzing historical production patterns for lower downtime and increased productivity of the site. Additionally, the continual iterative learning process helps in improving yield rate and lowering wastage. The growth looks exponential as companies try to integrate AI into their businesses and processes for faster GTM and lower cost of chip design.”
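As a rough illustration of the predictive-maintenance idea Bassi describes, the sketch below flags tool readings that drift away from historical production patterns. The sensor names, thresholds, and data are entirely synthetic assumptions, not taken from any real fab.

```python
# Toy predictive-maintenance sketch: fit a detector on historical "healthy"
# tool readings, then flag new readings that deviate from that pattern so
# maintenance can be scheduled before an unplanned stop.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)

# Historical readings: chamber temperature (C) and vibration (arbitrary units)
history = np.column_stack([
    rng.normal(65.0, 0.8, 5000),
    rng.normal(0.20, 0.03, 5000),
])

detector = IsolationForest(contamination=0.01, random_state=0).fit(history)

# New readings from the current shift; the last two drift out of pattern
current = np.array([
    [65.2, 0.21],
    [64.8, 0.19],
    [68.9, 0.34],   # rising temperature and vibration
    [70.1, 0.41],
])

flags = detector.predict(current)   # -1 = anomalous, 1 = normal
for reading, flag in zip(current, flags):
    status = "check tool" if flag == -1 else "ok"
    print(reading, "->", status)
```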

The crucial aspect is how AI will impact global chip production and design. The massive demand for AI will have a significant impact on the industry because the volume of data stored and processed by AI applications is enormous. According to experts, there is an urgent need to improve chip architecture to address data utilization in AI-based integrated circuits. AI-focused chip design will be not only about boosting overall performance, but also about improving data movement in and out of memory, coupled with more power-efficient and accurate memory systems. According to an engineering report published on irds.ieee.org, one option is the design of chips for AI neural networks that perform like human brain synapses. Instead of sending constant signals, such chips would “fire” and send data only when needed. Nonvolatile memory, which holds saved data without power, may also see more use in AI-related semiconductor designs. Combining nonvolatile memory on chips with processing logic would make “system on a chip” processors possible, which could meet the demands of AI algorithms.
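The “fire only when needed” behaviour can be illustrated with a toy leaky integrate-and-fire style unit, which accumulates input and emits an event only when a threshold is crossed, instead of producing output on every cycle. The constants and input signal below are arbitrary and purely illustrative.

```python
# Toy event-driven unit: output is generated only when accumulated input
# crosses a threshold, not on every time step.
def run_lif(inputs, threshold=1.0, leak=0.9):
    """Return the time steps at which the unit 'fires'."""
    potential = 0.0
    spikes = []
    for t, x in enumerate(inputs):
        potential = potential * leak + x   # leaky accumulation of input
        if potential >= threshold:         # send data only when needed
            spikes.append(t)
            potential = 0.0                # reset after firing
    return spikes

# Mostly quiet input with a short burst of activity in the middle
signal = [0.05] * 10 + [0.6, 0.6, 0.6, 0.6] + [0.05] * 10
print("spike times:", run_lif(signal))   # only a few events, not one per step
```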

Semiconductor experts have highlighted that while improvements in semiconductor design are accelerating to meet the data demands of AI applications, they also raise a host of production and manufacturing hurdles. As memory requirements have grown, so has the complexity of AI-based chips. Designs have become so large that it is not easy for a chip vendor to turn a profit on specialized hardware, for the simple reason that producing a dedicated AI chip for each application is very expensive. To counter this impediment, a general-purpose AI platform would be useful. Vendors could scale such a platform with inputs/outputs, accelerators, and sensors, and manufacturers would then be able to customize it for the workload requirements of any application while also reducing expenses.
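One way to picture such a general-purpose platform is as a shared base configuration that is specialised per workload rather than re-designed from scratch. Every field and option name in the sketch below is invented for illustration.

```python
# Hypothetical sketch: one shared base platform, specialised per workload by
# choosing accelerators, I/O and sensors instead of taping out a new chip.
BASE_PLATFORM = {
    "cpu_cores": 4,
    "memory_mb": 512,
    "accelerators": [],
    "io": ["spi", "i2c"],
    "sensors": [],
}

def configure(base: dict, workload: str) -> dict:
    """Return a workload-specific variant of the shared platform."""
    cfg = {**base,
           "accelerators": list(base["accelerators"]),
           "io": list(base["io"]),
           "sensors": list(base["sensors"])}
    if workload == "vision":
        cfg["accelerators"] += ["npu", "isp"]
        cfg["sensors"] += ["camera"]
    elif workload == "predictive_maintenance":
        cfg["accelerators"] += ["dsp"]
        cfg["sensors"] += ["vibration", "temperature"]
        cfg["io"] += ["can"]
    return cfg

print(configure(BASE_PLATFORM, "vision"))
print(configure(BASE_PLATFORM, "predictive_maintenance"))
```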

From the production perspective, the industry will also gain huge benefits from AI adoption. At every process point, AI can decrease manufacturing time, boost production efficiency, and reduce material losses. For the past several years, the chip industry has earned most of its revenue from the mobile device and smartphone market. With the smartphone market having peaked, the chip industry needs to find other industries for growth. AI applications, mostly in industrial robotics, autonomous vehicles, and big data, can offer growth opportunities to the chip industry. By defining a new AI strategy, chip companies can capture the full benefits of the emerging AI market.

Along with AI, the Internet of Things (IoT) is another important technology that is likely to disrupt the chip sector in both business and industrial domains. IoT has the potential to turn almost every device into a smart device, from retail and consumer products to medical science and irrigation. Industrial IoT is in huge demand these days. With rising demand for IoT solutions poised to generate huge revenues, the McKinsey Global Institute has stated that IoT applications will generate between $4 trillion and $11 trillion in value globally by 2025. This growth offers both complexities and opportunities for the chip industry worldwide.


Mohit Agrawal, Senior Research Analyst specializing in digital transformation, AI, and the IoT market, told Circuit Digest, “The key areas where IoT is going to make a difference in manufacturing are predictive maintenance, enhancing efficiency, asset tracking, etc. IoT and AI (computer vision) are being used for quality control in a big way. Another concept that is gaining currency is the digital twin. We can have a digital twin of a machine, a system, or the entire manufacturing setup. Digital twins enable us to simulate by taking real-time data into account.”
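To show what a digital twin can look like in miniature, the toy sketch below keeps a software model of a hypothetical etch tool in sync with simulated real-time readings and then projects wear forward under the current operating point. The machine model, its fields, and the wear formula are all invented for illustration.

```python
# Toy digital twin: a software model of a machine is kept in sync with
# (simulated) live telemetry and used for "what if" projections without
# touching the real production line.
from dataclasses import dataclass

@dataclass
class EtcherTwin:
    temperature_c: float = 60.0
    rf_power_w: float = 300.0
    wear: float = 0.0            # 0.0 = new, 1.0 = needs service

    def ingest(self, reading: dict) -> None:
        """Sync the twin with a real-time telemetry sample."""
        self.temperature_c = reading["temperature_c"]
        self.rf_power_w = reading["rf_power_w"]

    def simulate_hours(self, hours: float) -> float:
        """Project wear forward under the current operating point."""
        rate = 1e-4 * (self.rf_power_w / 300.0) * (self.temperature_c / 60.0)
        return min(1.0, self.wear + rate * hours)

twin = EtcherTwin()
twin.ingest({"temperature_c": 63.5, "rf_power_w": 340.0})   # live-style update
print("projected wear after 500 h:", round(twin.simulate_hours(500), 3))
```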

“IoT applications cannot work without integrated circuits and dedicated sensors, and therefore every IoT device needs semiconductors. The mobile phone or smartphone market, which boosted the chip market for several years, has now finally started to plateau. Chip vendors have a strong chance of capturing new revenue opportunities with the help of IoT, helping the sector maintain an average yearly growth rate of 3 to 4 percent in the future. Most importantly, IoT-backed devices would augment demand for integrated circuits, microcontrollers, sensors, memory, and connectivity, which is likely to put pressure on the current semiconductor supply chain ecosystem,” added Agrawal.

Some industry sources, who wished to remain unnamed, said that IoT applications mostly require very small microcontrollers that can be embedded in smaller electronics. There will therefore be huge pressure on the semiconductor industry to develop the innovative technology required for small chips while keeping chip power consumption in check. There is also a possibility that silicon, the current base material used in integrated circuits, could be replaced with gallium arsenide.

A few researchers still doubt that IoT will come to dominate the international chip market, even though many companies, including renowned ones, are paying attention to R&D in this sector. The major challenge is applying the technology: IoT products have specialized requirements, niche markets, and very low sales volumes. This thinking has left most chip firms reluctant to invest billions of dollars in IoT. The technology is likely to escalate R&D expenses, requires a highly skilled workforce, and involves cross-border regulation.
