
Synaptics and Google Collaborate on Edge AI for the IoT

The collaboration will integrate Google’s ML core with Synaptics Astra™ AI-Native hardware and open-source software to accelerate the development of context-aware devices.

SAN JOSE, Calif., Jan. 02, 2025 (GLOBE NEWSWIRE) -- Synaptics® Incorporated (Nasdaq: SYNA) today announced that it is collaborating with Google on Edge AI for the IoT to define the optimal implementation of multimodal processing for context-aware computing. The collaboration will integrate Google’s MLIR-compliant ML core with Synaptics Astra™ hardware, open-source software, and tools. The combination will accelerate the development of AI devices for the IoT that process vision, image, voice, sound, and other modalities to provide context for seamless interactivity in applications such as wearables, appliances, entertainment, embedded hubs, monitoring, and control across consumer, enterprise, and industrial systems.

The Synaptics Astra AI-Native compute platform for the IoT combines scalable, low-power compute silicon for the device Edge with open-source, easy-to-use software and tools, a strong partner ecosystem, and wireless connectivity. The platform builds on Synaptics’ foundation in neural networks, field-hardened AI hardware and compiler design expertise for the IoT, and refined, in-house support for a broad range of modalities. Google’s ML core is a highly efficient, open-source machine learning (ML) core that is compliant with the Multi-Level Intermediate Representation (MLIR) compiler framework.

“We are on the brink of a transformative era in Edge AI devices, where innovation in hardware and software is unlocking context-aware computing experiences that redefine user engagement,” said Vikram Gupta, Senior Vice President and General Manager of IoT Processors, Chief Product Officer at Synaptics. “Our partnership with Google reflects a shared vision to leverage open frameworks as a catalyst for disruption in the Edge IoT space. This collaboration underscores our commitment to delivering exceptional experiences while validating Synaptics’ silicon strategy and roadmap for next-generation device deployment.”

“Synaptics’ embrace of open software and tools and proven AI hardware makes the Astra portfolio a natural fit for our ML core as we ramp to meet the uniquely challenging power, performance, cost, and space requirements of Edge AI devices,” said Billy Rutledge, Director of Systems Research in Google Research. “We look forward to working together to bring our capabilities to the broad market.”

Synaptics and the Synaptics logo are trademarks of Synaptics in the United States and/or other countries. All other marks are the property of their respective owners.

About Synaptics Incorporated
Synaptics (Nasdaq: SYNA) is leading the charge in AI at the Edge, bringing AI closer to end users and transforming how we engage with intelligent connected devices, whether at home, at work, or on the move. As the go-to partner for the world’s most forward-thinking product innovators, Synaptics powers the future with its cutting-edge Synaptics Astra™ AI-Native embedded compute, Veros™ wireless connectivity, and multimodal sensing solutions. We’re making the digital experience smarter, faster, more intuitive, secure, and seamless. From touch, display, and biometrics to AI-driven wireless connectivity, video, vision, audio, speech, and security processing, Synaptics is the force behind the next generation of technology enhancing how we live, work, and play. Follow Synaptics on LinkedIn, X, and Facebook, or visit www.synaptics.com.

Investor Relations
Munjal Shah
Synaptics
+1-408-518-7639
munjal.shah@synaptics.com

Media Contact
Synaptics Incorporated
Patrick Mannion
Director of External PR and Technical Communications
+1 631-678-1015
patrick.mannion@synaptics.com


SPECIAL NOTE REGARDING FORWARD-LOOKING STATEMENTS

This website contains forward-looking statements that are subject to the safe harbors created under the Securities Act of 1933, as amended, and the Securities Exchange Act of 1934, as amended. Forward-looking statements give our current expectations and projections relating to our financial condition, results of operations, plans, objectives, future performance and business, and can be identified by the fact that they do not relate strictly to historical or current facts. Such forward-looking statements may include words such as "expect," "anticipate," "intend," "believe," "estimate," "plan," "target," "strategy," "continue," "may," "will," "should," variations of such words, or other words and terms of similar meaning. All forward-looking statements reflect our best judgment and are based on several factors relating to our operations and business environment, all of which are difficult to predict and many of which are beyond our control. Such factors include, but are not limited to, the risks as identified in the "Risk Factors," "Management's Discussion and Analysis of Financial Condition and Results of Operations" and "Business" sections of our Annual Report on Form 10-K for our most recent fiscal year, and other risks as identified from time to time in our Securities and Exchange Commission reports. Forward-looking statements are based on information available to us on the date hereof, and we do not have, and expressly disclaim, any obligation to publicly release any updates or any changes in our expectations, or any change in events, conditions, or circumstances on which any forward-looking statement is based. Our actual results and the timing of certain events could differ materially from the forward-looking statements. These forward-looking statements do not reflect the potential impact of any mergers, acquisitions, or other business combinations that had not been completed as of the date of this filing.