Redefining AI for Edge Computing
“Our fundamental thesis is that the future of AI lies in smaller, domain-specific, on-device LLMs that run at the edge with zero inference costs for OEMs while keeping data private and secure.”
The north-star vision of EdgeLens AI is to build smaller, highly customized AI models for edge devices with limited compute and power. By keeping all processing on-device, these models deliver low latency and eliminate inference costs for device manufacturers.