Human-computer interaction interface

The human-machine interface can be controlled by natural-language voice commands, through the LineBot menu interface, or by typed text dialogue. Other interactive interfaces can also be supported.

AI semantic engine deployed on edge devices

The engine can be deployed on edge or terminal devices, replacing touch controls with semantic voice control to reduce the spread of bacteria and viruses, and enabling devices to interact with humans in natural language.

Support for chips from various brands

Ubestream can deploy its semantic engine on microcontrollers with ultra-low power consumption at the edge, enabling lightweight operation for IoT devices on resource-constrained platforms.

AI semantic (NLP/NLU) STT (ASR)/TTS engine on edge devices

A research report from the National Development Council points out that over the next three years, global AI will move from the cloud to the edge (AI from Cloud to Edge). AI training will remain the domain of cloud AI, but AI inference will shift from the cloud to edge devices; that is, the spread of AI-capable edge and terminal devices will become a trend. In the future, edge and terminal devices will be able to respond with AI without connecting to the cloud or relying on GPU computing power. The key is lightweight AI algorithm technology, which allows AI edge computing to run on micro servers at the edge, on terminal devices, or even in chips.

Ubestream has successfully developed AI semantic edge computing technology that embeds a lightweight AI semantic algorithm into the low- and mid-range low-power chips used in consumer electronics. In the digital transformation driven by the post-epidemic era, AI semantic voice control chips will replace touch interfaces across many settings: air conditioning, TV audio, and lighting and security control in smart homes and smart hotels; multi-function ordering machines in smart stores and smart restaurants; autonomous or unmanned vehicles in smart transportation; mobile devices such as phones, earphones, and wearable watches and bracelets; and toys, video games, robots, and even interactive devices such as virtual idols built with XR technology.

AI semantic (NLU/NLP) chip applications

How does NLP work?

NLP applies algorithms to identify and extract natural-language rules so that unstructured language data is converted into a form that computers can understand.
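As a minimal sketch of this idea, the rule-based parser below turns a free-text utterance into a structured (intent, slots) form. The command patterns and intent names are hypothetical examples, not Ubestream's actual grammar:

```python
import re

# Hypothetical command patterns mapping unstructured utterances
# to a structured (intent, slots) representation.
RULES = [
    (re.compile(r"turn (on|off) the (light|tv|air conditioner)"), "device_power"),
    (re.compile(r"set temperature to (\d+)"), "set_temperature"),
]

def parse(utterance: str) -> dict:
    """Convert free text into a machine-readable intent with slot values."""
    text = utterance.lower().strip()
    for pattern, intent in RULES:
        match = pattern.search(text)
        if match:
            return {"intent": intent, "slots": list(match.groups())}
    return {"intent": "unknown", "slots": []}

print(parse("Please turn on the light"))
# {'intent': 'device_power', 'slots': ['on', 'light']}
```

A production engine would use trained models rather than regular expressions, but the output contract is the same: unstructured speech in, structured commands out.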
The AI semantic engine, developed by Ubestream Inc., has the following characteristics on edge devices:
(1) Offline deployment on edge devices: Speech-to-text (ASR) for specific retrained command sets and text-to-speech (TTS) can be deployed on edge devices quickly. Restricting the command set is essential on resource-constrained microcontroller systems.
(2) AI on low-power microcontrollers: Ubestream can deploy its semantic engine on microcontrollers with ultra-low power consumption at the edge, providing lightweight operation for IoT devices on resource-constrained platforms.
(3) Smart home & smart city: Technologies such as STT, TTS, and the semantic engine run on IoT devices, turning appliances into smart appliances for the smart home and, eventually, the smart city.
(4) Contactless operation: Users simply talk to IoT devices, which reply and act without being touched, helping to avoid public transmission of COVID-19.

Driving contactless digital transformation

In the digital transformation driven by the post-epidemic era, the AI semantic voice control chip will replace the touch interface and be used in scenarios such as smart homes, smart hotels, smart stores, and smart transportation.

Ubestream's embedded Semantic AI on Edge chipset technology holds a leading position among global AI companies and has established technical barriers to entry. It currently supports natural-language dialogue in Chinese and English, with Japanese planned for the future. Ubestream has begun negotiating cooperation with well-known domestic and foreign corporate groups.

Why use an AI semantic voice control chip?

01

The AI semantic voice control chip will replace the touch interface; its safety and hygiene benefits align with the digital transformation driven by the post-epidemic era.

The technology can be used in smart homes and smart hotels to control air conditioning, televisions, and lighting, and in smart stores and restaurants for multi-functional ordering machines.

02

As a human-computer interaction interface for wearable and other devices, the AI semantic chip enables a better user experience.

Smart transportation, autonomous vehicles, mobile devices, earphones, wearable watches, video games, robots, and even interactive devices such as virtual idols using XR technology can all benefit. The AI NLU chip can be embedded as a co-processor, allowing these devices to offer interactive NLU voice control without relying on GPU computing power; it can even work without connecting to the cloud.

03

The global AI industry is slowly shifting from cloud to edge. The application of this technology will therefore allow companies to stay ahead of future trends.

A research report by the National Development Council points out that the global AI market will shift from cloud to edge in the future. Enterprises that adopt AI semantic voice technology early can gain an advantage in the market, seize the available business opportunities, and ultimately increase their market share.

Are you interested in our AI semantic voice control technology?

You are welcome to inquire about business cooperation, technology introduction, strategic investment, and more.