
    Insights and Innovations: Highlights from Embedded Vision Summit 2024

    May 31, 2024


    This past week marked the highly anticipated Embedded Vision Summit, a gathering showcasing the latest advances in edge vision and AI applications.

    We were delighted to be part of the show and had great discussions with software engineers who are building new edge AI and vision applications and looking to scale them globally – exactly what the Nx Toolkit helps them accomplish. We also engaged with visionaries pushing the boundaries of new edge AI hardware and algorithms. Now that the dust has settled a bit, let’s take a short look at what we learned at the summit and at our next steps.

    Network Optix @ the Summit

    Network Optix was present at the summit in multiple ways. We had an impressive booth on the exhibition floor, showcasing a range of our capabilities:

    • We demonstrated the Nx Server, our extremely lightweight media server that makes connecting, consuming, and storing video streams as easy as possible – running on edge devices ranging from small, ARM32-based, body-worn cameras to larger edge boxes supporting hundreds of video streams simultaneously.
    • We showcased Nx Go: Specifically built for the transportation industry, Nx Go bundles our strengths in managing video streams, managing locations, and effectively overseeing a whole city’s transportation system through a unified portal, performing as an operating system for transportation.
    • We presented Nx EVOS, our Enterprise Video Operating System. Nx EVOS is the foundational layer for everything video - it is the platform that powers Nx Go, Nx Witness, and other products built with Nx. Building on Nx EVOS, and using the many developer tools available in the Nx Toolkit, we demonstrated how to build global-scale video applications. The latest addition to the Nx Toolkit, Nx AI Manager, makes deploying AI models on any video stream from an edge device effortless.
    • Robin van Emden gave several live coding sessions at our booth, demoing how to take an AI model trained to execute a vision task – for example, reading the state of an industrial gauge – and scale it globally in an instant. Using Nx EVOS and the tools in the toolkit, Robin demonstrated how the Nx Toolkit can take your solution from a trained model to multi-camera deployment, visualization, data management, and user management in minutes; the sketch after this list shows the single-stream starting point.
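
    As a concrete illustration of where that demo starts – before any Nx tooling is involved – the sketch below runs a trained ONNX vision model frame-by-frame over a single video stream. The model file name, stream URL, input size, and output interpretation are illustrative assumptions for this post, not part of the actual demo or the Nx Toolkit API.

    ```python
    # A minimal, generic single-stream inference loop: the "trained model"
    # half of the demo. Scaling this to many cameras, visualization, and
    # user management is what the Nx Toolkit layers on top.
    import cv2                 # pip install opencv-python
    import numpy as np
    import onnxruntime as ort  # pip install onnxruntime

    # Hypothetical gauge-reading model exported to ONNX; path is illustrative.
    session = ort.InferenceSession("gauge_reader.onnx")
    input_name = session.get_inputs()[0].name

    # Illustrative RTSP URL; any source cv2 can read works here.
    cap = cv2.VideoCapture("rtsp://camera.example/stream")
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Assume the model expects a 1x3x224x224 float32 tensor in [0, 1].
        blob = cv2.resize(frame, (224, 224)).astype(np.float32) / 255.0
        blob = np.transpose(blob, (2, 0, 1))[np.newaxis, ...]
        outputs = session.run(None, {input_name: blob})
        print("gauge reading:", outputs[0])  # meaning depends on the model
    cap.release()
    ```

    Going from this loop to hundreds of managed streams is exactly the gap the demo addressed.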


    In addition to the activities and demos at the booth, we also had the opportunity to give talks during EVS. On Wednesday morning, I had the honor of giving a talk called “Scaling Vision-Based Edge AI Solutions: From Prototype to Global Deployment,” detailing the challenges the field faces in jointly growing the market and moving beyond proofs of concept. The talk also introduced OAAX (see below). Later on Wednesday, Robin van Emden gave a hands-on, developer-focused live coding session called “Building and Scaling AI Applications with the Nx AI Manager,” demonstrating how to use the Nx Toolkit to build on Nx EVOS. Finally, on Thursday, Network Optix’s CEO Nathan Wheeler gave a talk called “Nx EVOS: A New Enterprise Operating System for Video and Visual AI,” detailing the history of EVOS and the necessity of having an operating system for your video across all verticals.

    Gen AI and “Generalist” Models

    Besides our own presence at the summit, there was a packed program of presentations, booths, and talks. There was a lot of discussion about the opportunities of generative AI, novel transformer networks, and multimodal “generalist” networks. Notably, on Wednesday, Yong Jae Lee gave an amazing talk on the opportunities of multimodal generalist models, highlighting the future of integrated language, audio, and video capabilities, “generic” learning, and out-of-the-box semantic integration. Multimodal LLMs were a big part of the technical buzz at the show, with a panel session on the topic hosted on Thursday.

    Across the discussions of generative AI, novel architectures, and new hardware, there was a strong focus on making AI and vision applications work at the edge. The exhibition floor was packed with new (and established) silicon manufacturers; it was great to see advances in computational speed, energy efficiency, and cost, which together are making vision AI applications at the edge a reality.

    Building on EVOS: The Nx Toolkit

    This year’s EVS was special, as it marked the first public release of the Nx Toolkit, featuring Nx AI Manager. For many years, Nx partners have built all kinds of video and AI applications on top of our core video operating system using our SDKs (Metadata SDK, Video Source SDK, Storage SDK, etc.). However, truly releasing the toolkit, including the new Nx AI Manager, which makes AI model deployment and vision pipeline management effortless, was a blast. We received lots of interest from developers exploring how to mature their video and AI proofs of concept into globally scaled enterprise products – exactly what the Nx Toolkit enables you to do within minutes. If you want to know more, check out our developer portal or contact us to get started.

    OAAX: Easy-to-Use Edge AI Portability

    Finally, during the summit, we also had a chance to launch and actively discuss OAAX: The Open AI Accelerator eXchange. OAAX, at its core, is a standardization layer – a common interface – to make trained model deployment to novel edge hardware as easy as possible. OAAX combines two things: First, a standardized interface surrounding the “OAAX Toolchain,” the process that takes a generically specified trained AI model (in ONNX format) and converts it into a specific format that can be run on the target hardware. The high-level interface gives accelerator manufacturers full flexibility to implement their own toolchains, while making usage for developers unified and easy. Second, a standardized model runtime interface, the “OAAX Runtime.” Here again, OAAX focuses on standardizing the interface such that developers can run inferences on any chip in one unified way, while providing full flexibility to manufacturers.
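
    To make the shape of those two interfaces concrete, here is a hypothetical sketch of what a standardized toolchain and runtime contract could look like from a developer’s point of view. The class and method names are illustrative assumptions for this post, not the published OAAX specification.

    ```python
    # Hypothetical sketch of the two contracts OAAX standardizes; names are
    # illustrative, not the actual OAAX interfaces.
    from abc import ABC, abstractmethod
    from typing import Any, Dict


    class Toolchain(ABC):
        """Vendor-implemented: convert a generic ONNX model into a
        hardware-specific artifact. How it converts is up to the vendor."""

        @abstractmethod
        def convert(self, onnx_path: str) -> str:
            """Return the path of the accelerator-specific model produced."""


    class Runtime(ABC):
        """Vendor-implemented: load a converted model and run inference on
        the target chip behind one uniform call signature."""

        @abstractmethod
        def load(self, model_path: str) -> None: ...

        @abstractmethod
        def infer(self, inputs: Dict[str, Any]) -> Dict[str, Any]: ...


    def deploy(toolchain: Toolchain, runtime: Runtime, onnx_path: str) -> Runtime:
        """Developer-side code: identical for every accelerator, because
        only the implementations behind the interfaces differ."""
        runtime.load(toolchain.convert(onnx_path))
        return runtime
    ```

    The design point is the same one made above: vendors keep full flexibility inside their conversion and inference implementations, while developers write a single, unified deployment path for every chip.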

    At this point in time, OAAX consists of a number of toolchains and accompanying runtimes that we use internally; we have open-sourced their implementation and specification. In the coming months, building on the excitement generated at EVS, we will be building out the organization surrounding OAAX to further mature this open project.
