There’s more open source development functionality on the way for microcontroller-driven applications running from Arm’s products.

Callum Cyrus

June 22, 2021


Arm has launched an open source initiative to encompass elements of its abstraction software for IoT microcontroller applications.

The Common Microcontroller Software Interface Standard (CMSIS) provides development access to Arm’s Cortex-M microprocessor range, along with peripherals and middleware.

The new Open-CMSIS-Pack aims to help software engineers reuse command-line development tools and programming code across multiple microcontroller-based applications.

Arm’s announcement comes as the Internet of Things (IoT) grows through new device rollouts and computational upgrades, bringing IoT devices into hospitals, gyms and factories.

Why Microcontroller Units for IoT?

Bandwidth congestion and security concerns have sparked demand for beefed-up end devices that can protect data and reduce latency.

Microcontroller units (MCUs) are integrated stacks that contain not just the microprocessor but also built-in RAM, connectivity options and ports toward external inputs such as sensors. In effect, this makes them a self-contained computer.

Many IoT devices currently in operation use microcontroller units. They tend to be more cost- and space-efficient than using microprocessors coupled with miniaturized peripherals. A low-voltage microcontroller can run using just milliwatts of electricity, but it has less computational power and flexibility than traditional CPUs.

Basic IoT devices – simple binary sensors, for example, that report nothing more than an on/off state – don’t require a microcontroller.

IT pros must weigh several objectives to determine which microcontroller best fits their needs. Certain MCUs dissipate substantial heat from their circuits, which poses difficulties for always-on devices such as industrial sensors.

Another key decision is whether to go for a single or multi-task architecture.

Consider the implications of Moore’s Law, whereby the number of transistors on a typical chip doubles every two years or so. As embedded microcontrollers become more powerful, IoT devices will evolve to take over more cloud-hosted functionality, but only if manufacturers can keep development lead times and energy usage in check.

Higher-end MCUs often combine low-energy controllers with processing units that are dedicated to implementing deep learning networks, for purposes such as computer vision. An MCU-hosted algorithm might comb through videos as stored on the end-device, before dispatching only relevant information to cloud-hosted analytics services. Meanwhile, cybersecurity flaws in IoT networks could be automatically identified using advanced machine learning technology.

According to Omdia’s research, high-power embedded microcontrollers are winning market share.

Revenue for 32-bit MCU chipsets could reach around $2.2 billion this year, versus around $650 million for 8- and 16-bit products, according to Omdia.

The divide was far narrower in 2016, when 8- and 16-bit chipsets generated $800 million of revenue versus $1.3 billion for 32-bit models.

That suggests more IoT devices installed today are going beyond basic sensing to fulfill complex computational requirements.

But adding 32-bit units to the mix risks overwhelming IT pros, who must keep IoT networks from sprawling while maintaining compatibility with legacy systems.

It was in this vein that Arm’s CMSIS was designed. The idea is to address MCU software compatibility issues, given the wide assortment of IoT endpoints.

For example, IT pros can access support for almost 9,000 distinct microcontroller products via CMSIS’s software packaging component.

The trajectory toward simplified MCU development echoes the progression of computer microprocessors in the 1970s. At the beginning of the personal computing revolution, developers faced a string of 8- and 16-bit processors, but instruction set architectures were rationalized in the following decade, which made building software simpler.

With NVIDIA’s proposed acquisition on the horizon, Arm is working to cement its foothold in the IoT microcontroller sector.

Arm-based MCUs build on the compact, machine learning-focused processors in its Cortex-M range. Edge-driven computer vision and anomaly detection are some of the biggest draws.

“In the wireless MCUs that I track, the ARM architecture dominates almost completely. Five years ago, I’d say that ARM accounted for maybe 60% of wireless MCU designs and maybe about 80% of the unit volume,” said Lee Ratliff, senior principal analyst at Omdia. “Today, I’d say those numbers are closer to 90% and 95% respectively.”

Ultimately, the trend should go a long way to accomplishing the mission of bringing computational loads to the edge. In practice, however, the shift will require more than just highly powered MCUs, Ratliff said.

Most enterprises need to think through the logistics of distributing compute between edge and cloud settings.

Infrastructure designs will be crucial – there’s little use in running machine learning on endpoints only for the results to be stuck in a network logjam.

“MCUs have become more powerful and that may have shifted some of the computational load from the cloud, but I don’t think it’s become a [definite] trend yet.

“More will come in future. I don’t believe that a lack of computational power is what is preventing edge computing,” said Ratliff. “I think splitting the computation between edge and cloud is a pretty significant change in how the IoT is architected and changes of that magnitude always take longer than you would think.”
