January 7, 2025

Sony takes on the challenge of automated driving by developing a Sensor Fusion System on Azure

Advancing automated driving systems requires greater accuracy in imaging and sensing technologies, along with the processing of vast amounts of data. On-premises infrastructure, however, demands considerable time and effort to manage large datasets and machine learning workloads.

Sony Semiconductor Solutions Corporation migrated its development environment to Azure and introduced AVOps, a reference architecture for automated driving development. With this environment, the company’s "Sensor Fusion System" development has progressed significantly.


AVOps streamlines data ingestion, annotation and machine learning workflows, accelerating the development of sensor fusion systems and improving recognition performance. An image scene analysis system built with Azure AI Studio is now undergoing validation.

Sony Semiconductor Solutions Corporation

Transferring the development environment for sensor fusion systems to Azure

One technology people expect to see everywhere in the near future is automated driving systems in automobiles. As these systems grow in sophistication, we can expect them to spawn all kinds of solutions to social problems. They can be expected to solve the shortage of drivers, for example, prevent traffic accidents, and give rise to new ways of passing the time during journeys (“secondary activities”). Automated driving is sure to prove a touchstone for an array of innovations.

In Japan, vehicles are already commercially available that feature partial automation of acceleration, braking and steering. Automakers and related companies, IT firms and various levels of government are working together to drive this technical revolution forward, along with the necessary legal framework, to raise the level of automated driving systems. At the same time, a number of hurdles remain to be overcome. Among these are the imaging and sensing technologies that serve as the “eyes” of the automobile of the future.

Sony Semiconductor Solutions Corporation (SSS) is a developer of semiconductor devices, with strengths in imaging and sensing technologies. The company’s main area of business is the image sensor field, where it holds the global top position by value. In this field, SSS began full-fledged development of image sensors for automotive use in 2014. Drawing on technology it accumulated developing image sensors for digital cameras and smartphones, SSS is striving to improve imaging and sensing technologies for drive assist and other features to come.

To advance the development of image sensors for on-board (vehicle-mounted) cameras, SSS had to apply its existing strengths in some ways but change its way of thinking in others. So argues Tsutomu Haruta, Senior General Manager of SSS’ Automotive Business Division.

“As a company we’ve always advanced imaging technologies that move people with the clarity and beauty of the images they capture,” explains Haruta. “But to develop image sensors for on-board cameras, what we need is sensing technology: technology that can collect the data that AI and other systems need to recognize objects, even when traveling at highway speeds and at night. In the Automotive Business Division, as we develop image sensors dedicated to sensing applications, we ask questions such as: What kinds of data are useful to collect for on-board cameras? And what does it mean to say that a machine ‘recognizes’ an object anyway?”

The purpose of on-board cameras, Haruta explains, is not to output images for people to see, as digital cameras and smartphone cameras do, but to capture the data inherent in objects in fine detail and hand it off to the systems that use it. “For data to be put to beneficial use, the recognition phase is essential,” explains Haruta. This understanding led the Automotive Business Division to develop a sensor fusion system: a system that combines input from multiple sensors to enhance the sensors’ recognition performance.

Shinji Igarashi is Senior Manager, Automotive Sensing Development Department in SSS' Automotive Business Division. Igarashi recalls, “Our department began work on software development in 2016. Back then we were building our systems on-premises. Dealing effectively with the huge volumes of data contained in each image was incredibly costly in terms of time and labor.” At the same time the company had decided to participate in the Green Innovation Fund sponsored by Japan’s Ministry of Economy, Trade and Industry (METI). This development gave SSS fresh encouragement to develop automated-driving technology.

“The on-board cameras in one vehicle typically capture several terabytes of data a day,” explains Igarashi. “Knowing that we had to develop our software quickly, even as we handled these vast reams of data, we opted to move our development environment to Microsoft Azure, a cloud service. Data captured in the vehicles was transferred to Azure using the Azure Data Box service. Using the Data Box family of offline transfer devices, data was moved from car to cloud in a fast, secure, and reliable manner.”

Developing the software required sharing data not only within the company but with outside partner companies as well. To work efficiently, SSS needed a cloud environment that would make data management easy. According to Igarashi, one reason SSS chose Microsoft Azure was its wide range of platform-as-a-service (PaaS) services, which could be combined for efficient use.


After we upgrade our image sensors as input devices, I believe future prospects include not only automated driving but also the use of machines with features that exceed human vision. These innovations will make our lives more fulfilling and enriching

Mr. Tsutomu Haruta, Senior General Manager, Automotive Business Div., Sony Semiconductor Solutions Corporation

Adopting AVOps, Microsoft’s reference architecture for developing automated driving

With this development environment in place, SSS’ development of sensor fusion systems has advanced dramatically. To reprise, a sensor fusion system combines input from multiple sensors, each with its own specialized function, so that the sensors complement one another and enhance the recognition performance of the system as a whole.

Supporting the workflow of the sensor fusion system development process is AVOps, Microsoft’s reference architecture for developing automated driving.

“AVOps was unveiled at the 2021 Consumer Electronics Show (CES),” notes Igarashi. “It’s an architecture that ties together the wide range of solutions necessary to make automated driving happen, and it includes a lot of content that we felt could be a useful reference for our work. And since we had already decided to port our development environment to Azure, this architecture was the easiest for us to use.”

Having implemented AVOps, SSS set out to build a customized infrastructure for its own use, guided by a three-year plan. However, because METI’s Green Innovation Project had already begun, the infrastructure build-out and the Green Innovation Project had to be implemented in tandem.

To handle this state of affairs, SSS adopted a strategy of successive releases, starting with the features that addressed the company’s most pressing issues and were central to the infrastructure. The company focused on three priority items: “ingest processing,” which extracts, processes and saves the necessary data from within the acquired data; “annotation,” which attaches the attributes needed for machine learning; and “machine learning” itself, carried out on virtual machines.

“Refining the ingest process was always a major issue,” reflects Igarashi. “Annotation required exchanging data with partner companies at high speed. And because the PCs we used for machine learning were expensive, we couldn’t budget for a fleet of machines of the necessary caliber. By using Azure Machine Learning, we were able to keep costs down while carrying out machine learning in parallel.”
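The point about parallel machine learning can be illustrated with a minimal sketch. Everything below is invented for illustration: a real setup would submit jobs to Azure Machine Learning rather than train locally, but the fan-out pattern, running many training configurations concurrently on rented compute instead of a rack of dedicated PCs, looks roughly like this:

```python
from concurrent.futures import ThreadPoolExecutor

def train(config):
    """Stand-in for one training run; a real job would call a managed ML service."""
    lr, epochs = config
    # Pretend "loss" improves with more epochs and a larger learning rate.
    loss = 1.0 / (epochs * lr)
    return {"lr": lr, "epochs": epochs, "loss": round(loss, 3)}

# Three hypothetical hyperparameter configurations to explore.
configs = [(0.1, 10), (0.01, 10), (0.1, 20)]

# Fan the runs out in parallel, the way a cloud service schedules
# many jobs at once rather than queuing them on a single machine.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(train, configs))

best = min(results, key=lambda r: r["loss"])
print(best)  # {'lr': 0.1, 'epochs': 20, 'loss': 0.5}
```

The design point is the scheduling pattern, not the stub model: swapping `train` for a call that submits a cloud job leaves the surrounding fan-out and best-result selection unchanged.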

AVOps broadly consists of three processes. The first is DataOps, which uploads to Azure sensor data captured from moving vehicles using the Azure Data Box service and manages the ingestion and annotation of the saved data. The second is MLOps, which manages the workflow for development of the machine-learning model using the data provided by DataOps. The third process, ValOps, evaluates the learning in the MLOps process. SSS has already completed the buildout of DataOps and MLOps and is now working on construction of ValOps.
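As a rough sketch of how the three processes hand off to one another (all names and data shapes here are invented for illustration; AVOps itself is a reference architecture, not a Python API), each stage consumes the previous stage's output:

```python
def data_ops(raw_clips):
    """DataOps: ingest uploaded sensor clips and attach annotations."""
    return [{"clip": clip, "label": f"annotated-{clip}"} for clip in raw_clips]

def ml_ops(dataset):
    """MLOps: train a recognition model on the annotated dataset (stubbed)."""
    return {"name": "fusion-model-v1", "trained_on": len(dataset)}

def val_ops(model, holdout):
    """ValOps: evaluate the trained model against held-out clips (stubbed)."""
    return {"model": model["name"], "clips_evaluated": len(holdout)}

clips = ["clip_001", "clip_002", "clip_003", "clip_004"]
dataset = data_ops(clips)                     # DataOps output feeds MLOps
model = ml_ops(dataset[:3])                   # train on most of the data
report = val_ops(model, holdout=dataset[3:])  # ValOps checks the result
print(report)  # {'model': 'fusion-model-v1', 'clips_evaluated': 1}
```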

“The services deployed on Azure, including AVOps, are very diverse,” marvels Igarashi. “The more services we implement, the more efficient our workflow becomes. But some services have a greater effect on our project than others. In developing the infrastructure, we took care not to over-engineer it, in view of cost concerns.” Igarashi emphasizes that focusing on judging the optimum solution is essential for the construction of an effective infrastructure.

Aiming for further efficiencies with the introduction of Azure AI Studio

SSS is now working on the next move: increasing work efficiency through the application of generative AI.

One key question for the efficient development of a sensor fusion system is how precisely the data necessary for machine learning and the data that need to be evaluated can be extracted from the oceans of data acquired. As the volume of data grows and grows, the task of extraction only becomes more complicated. SSS used Azure AI Studio to build a system that can analyze visual scenes with high efficiency and is currently testing it.

“The old approach was to append tags, embedding words in image data. This was an inordinately labor-intensive process, and if the embedded words didn’t match the search terms exactly the information would be overlooked,” laments Igarashi. “This time we’re using generative AI to embed detailed scene descriptions in images and using natural language to search them. We think this approach will make searches instantaneous, raising both efficiency and retrievability.”

Images can be searched not only by natural language but by comparison with other images as well, so similar scenes can be searched without the need to use language. Eventually SSS hopes to use generative AI for tasks such as checking the quality of annotation, troubleshooting recognition errors and generation of test images.
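The caption-and-search idea can be sketched with a toy example. SSS's system uses generative AI to write the scene descriptions and Azure AI Studio to search them; the stand-in below uses plain bag-of-words overlap instead of learned embeddings, and every image ID and caption is invented:

```python
import math
from collections import Counter

def vectorize(text):
    """Turn a caption or query into a bag-of-words vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical AI-generated scene descriptions keyed by image ID.
scenes = {
    "img_001": "rainy night highway with oncoming headlights",
    "img_002": "pedestrian crossing at a sunny intersection",
    "img_003": "truck merging on a rainy highway at dusk",
}

def search(query, top_k=2):
    """Rank images by how closely their captions match a natural-language query."""
    q = vectorize(query)
    ranked = sorted(scenes, key=lambda i: cosine(q, vectorize(scenes[i])),
                    reverse=True)
    return ranked[:top_k]

print(search("rainy highway at night"))  # ['img_001', 'img_003']
```

In a production system the captions would come from a multimodal model and both captions and images would be embedded as dense vectors, which is also what makes image-to-image similarity search work without any language at all.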

“We think we’ll realize the true value of generative AI if we can use it to strengthen our model for learning from recognition errors, thereby improving recognition performance, and automate the evaluation cycle,” Igarashi explains.

Microsoft Japan assigned engineers to SSS to implement Azure AI Studio, and the Microsoft Japan engineers and SSS engineers worked closely together to move the project forward. Igarashi praises this collaboration, saying, “The project not only got Azure AI Studio implemented quickly but also stimulated the talents of our engineers.”


This time we’re using generative AI to embed detailed scene descriptions in images and using natural language to search them. We think this approach will make searches instantaneous, raising both efficiency and retrievability

Mr. Shinji Igarashi, Senior Manager, Automotive Sensing Development Dept., Automotive Business Div., Sony Semiconductor Solutions Corporation

It’s vital that the people using these tools stay laser-focused on their goals.

“Development of infrastructure and development platforms never ends,” observes Igarashi. In its infrastructure buildout project, implemented according to a three-year plan starting in 2022, SSS has now reached certain milestones. In view of this accomplishment, the company is now setting its sights on expanding the system’s feature set while improving and automating its operation.

“While we appreciate that using cloud tools to build the infrastructure has improved the efficiency of our development work, we must never forget that these are only tools,” cautions Igarashi. “The people who use these tools must keep their eyes on the goal, maintaining full and open communication to ensure that we continue to use only the necessary methods,” he adds, as Haruta nods in enthusiastic agreement.

“After we upgrade our image sensors as input devices, I believe future prospects include not only automated driving but also the use of machines with features that exceed human vision. These innovations will make our lives more fulfilling and enriching,” says Haruta. Tools and technologies will change over time, he emphasizes, but what will not change is the aim of more prosperous and fulfilling lives for people.

Asked for a concluding message, Haruta affirmed his expectation of further collaboration with Microsoft Japan: “We’re a company focused on manufacturing; Microsoft Japan is a company focused on systems. By working together, I believe we can bring that bright future closer than ever.”

SSS’ corporate slogan is “Sense the Wonder.” The world, says SSS, is overflowing with wonderful things to surprise and amaze us. SSS is always moving forward, seeking to expand the limitless possibilities of humanity and blaze a trail to a bright future. Microsoft Japan is proud to accompany SSS on its journey of transformation.


We’re a company focused on manufacturing; Microsoft Japan is a company focused on systems. By working together, I believe we can bring that bright future closer than ever

Mr. Tsutomu Haruta, Senior General Manager, Automotive Business Div., Sony Semiconductor Solutions Corporation
