“AI is the new electricity.” One does not have to look very hard to find statements such as this one from AI pioneer Andrew Ng. Interest in and excitement about AI are growing at an amazing pace, fueled by significant improvements on tasks such as object recognition in images, natural language processing, and speech recognition. Developing products and services that leverage AI requires a wide range of trade-offs and architectural decisions. One of the most important architectural choices, affecting both the technology development and the user experience, is whether AI processing happens in the cloud or directly on the edge device.
For AI algorithms to perform well, they require large amounts of data and significant processing power, which has driven most AI deployments to be cloud-based. Among other benefits, the cloud gives these solutions access to substantial computing resources, easy deployment and management, and a variety of data sources. However, as the computing resources in edge devices increase, it is becoming feasible to deploy AI directly onto those devices.
While it is possible for edge devices to run AI algorithms, it is still critical to evaluate whether it actually makes sense to move the processing out of the datacenter and onto the edge. Deploying AI on the edge brings challenges such as increased development cost, trade-offs between cost and performance, and the complexity of managing models deployed across distributed devices. To evaluate whether AI at the edge is appropriate, we suggest asking three simple questions. If the answer to any of them is yes, AI on the edge is likely a good fit for the application.
- Does the application require low-latency responses?
For many applications, the time it would take to send data to the cloud, have it processed, and receive the result back on the edge device would be prohibitively long. Two clear examples are autonomous vehicles, which need to make split-second decisions, and voice-control interfaces, where reductions in latency significantly improve the user experience.
- Are the sensors on the edge generating more data than can be sent to the cloud for processing?
There are three primary reasons data may not be sendable to the cloud: the cost of sending it is too high, the power required to send it is infeasible, or the connection simply does not have the throughput to support the data volume. In each of these cases, moving the AI algorithms to the edge can yield significant improvements in operational cost, battery life, and overall feasibility.
- Is connectivity limited, such that a connection to the cloud is not guaranteed?
If an application requires a device to act on sensor data but the connection to the cloud is not guaranteed, uptime and performance can be improved significantly by deploying the AI algorithm on the edge. For example, when the voice assistant on a mobile device relies on the cloud for processing, the user experience fails without an internet connection. By moving a basic version of the voice recognition engine onto the device itself, core functionality remains available regardless of connectivity, as sketched in the example below.
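To make that third question concrete, here is a minimal sketch (in Python) of the cloud-first, on-device-fallback pattern described above. The function names (`has_connectivity`, `cloud_transcribe`, `local_transcribe`) are hypothetical placeholders rather than any particular vendor's API; a real voice assistant would wire them to its actual speech services.

```python
import socket

def has_connectivity(timeout_s: float = 1.0) -> bool:
    """Cheap reachability check; real devices usually get this from the OS."""
    try:
        socket.create_connection(("8.8.8.8", 53), timeout=timeout_s).close()
        return True
    except OSError:
        return False

def cloud_transcribe(audio: bytes) -> str:
    """Placeholder for a full-featured, cloud-hosted speech model."""
    raise NotImplementedError("call your cloud speech API here")

def local_transcribe(audio: bytes) -> str:
    """Placeholder for a smaller on-device model (e.g., a keyword spotter)."""
    return "<on-device transcription>"

def transcribe(audio: bytes) -> str:
    # Prefer the cloud model when a connection exists; otherwise keep
    # base functionality working with the on-device model.
    if has_connectivity():
        try:
            return cloud_transcribe(audio)
        except Exception:
            pass  # network dropped mid-request; fall back below
    return local_transcribe(audio)
```

The on-device model will typically be smaller and less capable than its cloud counterpart, but the point of the pattern is that the user always gets a response, with the cloud adding quality when it is reachable.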
Excitement around AI is likely to keep building as the technology’s capabilities grow. However, not all applications are well suited for AI on the edge. By confirming that AI on the edge makes sense for your solution or product, you can maximize your likelihood of success. As you consider how best to apply AI in your products and solutions, hopefully these simple questions can provide some insight and guidance. Are you curious about whether or not AI is the right technology solution for your product? Call us at 616.393.7766 or send us a note at hello@twisthink.com; we would love to explore this further with you!