Next generation of edge computing: AI, microservices and container orchestration

Nvidia Jetson, Kubernetes, CUDA, arm64 and TensorFlow to boost your AI projects and edge analytics

Part 1: Development of an AI model with Keras and TensorFlow, and optimization with CUDA and TensorRT

Part 2: Deployment as a containerized microservice in Kubernetes with GPU acceleration support on the Nvidia Jetson Nano, and performance monitoring
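To give a flavor of the Part 1 workflow, here is a minimal Keras sketch. The model architecture and the synthetic data are illustrative assumptions, not the article's actual model; in the full pipeline, the trained model would then be exported and optimized with TensorRT for the Jetson's GPU.

```python
import numpy as np
import tensorflow as tf

# Tiny binary classifier, a stand-in for a real edge-analytics model
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

# Synthetic data purely for illustration
x = np.random.rand(128, 4).astype("float32")
y = (x.sum(axis=1) > 2.0).astype("float32")

# Short training run; a real model would train on field data
model.fit(x, y, epochs=2, batch_size=32, verbose=0)

# Inference: one probability per sample
preds = model.predict(x, verbose=0)
print(preds.shape)
```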

Current edge IoT/computing solutions now offer strong integration capabilities for both field devices and cloud platform vendors. They provide the bridge we need to integrate heterogeneous protocols, design our business logic without dealing with device specifics, and at the same time offer the connectivity required to consolidate our data into business-oriented KPI dashboards in the cloud.

But as the volume of data we need to crunch continuously increases, pushing this huge amount of traffic back to the cloud is not optimal in terms of cost or decision-making latency.

Also, for critical assets, cloud integration, especially with public clouds, is clearly a major security concern. Even with a proper architecture design, the risks cannot be completely eliminated, so local processing is always preferable for critical processes when possible.

The new generation of edge controllers tackles these challenges both by hardening security in the lowest layers (e.g. secure encrypted boot) and by embedding GPU co-processing to accelerate deep learning model computation. Data processing and decision making can therefore safely be done close to the assets, on premises.
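On the Kubernetes side, exposing such a GPU to a container typically goes through the NVIDIA device plugin, which advertises an `nvidia.com/gpu` resource that pods can request. A minimal sketch of such a deployment follows; the deployment name, labels and image are illustrative assumptions:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: inference-service          # hypothetical name
spec:
  replicas: 1
  selector:
    matchLabels:
      app: inference-service
  template:
    metadata:
      labels:
        app: inference-service
    spec:
      containers:
      - name: inference
        image: myregistry/inference:arm64   # illustrative arm64 image
        resources:
          limits:
            nvidia.com/gpu: 1      # satisfied by the NVIDIA device plugin
```

The scheduler will only place the pod on a node (here, a Jetson) whose device plugin reports an available GPU.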

If you want to know more you can read the full article on Medium.

