Press Release, 26 November 2025
Generative AI: Breakthroughs, Barriers, and What’s Next
In a panel chaired by Soma Pirityi, CEO & Founder at ETA Technologies, panellists Nadav Eiron, Senior Vice President of Cloud Engineering, Frederik Heda, Data & AI Lead at BP, Lourens Walters, Senior Data Scientist at Informa, and Eleana Kafeza, Research Lead at the Technology Innovation Institute, discussed the evolving landscape of generative AI. The session explored the latest developments in the market, innovations in technology, and the challenges involved in model training, deployment, security, scalability and efficacy.
The Five Key Takeaways from this Session:
Short on time? We’ve summarised the top five takeaways from the panel below!
- Edge AI Democratises Access: Smaller, more efficient models like TII's Falcon series (3.5B parameters matching 7-10B parameter performance) enable powerful AI to run directly on phones, watches, and laptops with just a 1GB memory footprint, making AI accessible beyond traditional computing platforms.
- Infrastructure Gap Blocks Enterprise Adoption: Current AI infrastructure remains "rudimentary" and built for research rather than production, lacking the abstractions, lifecycle management, and integration standards that enterprises need for large-scale deployment.
- AI Evolution from Assistant to Autonomous Operator: Enterprise AI has moved beyond chatbots to agent frameworks that can autonomously execute entire campaigns, optimise operations, and continuously improve through self-learning feedback loops across multimodal capabilities.
- Ambient AI Eliminates Technical Barriers: Natural language interfaces will allow users to ask complex business questions without technical expertise, while physical AI systems will provide real-world guidance and teaching capabilities within 3-5 years, particularly benefiting small and medium enterprises.
- Open Source Distribution Accelerates Innovation: Making powerful AI models freely available creates an ecosystem "where AI is for everybody," enabling widespread development and deployment, though success still requires better abstractions to move beyond specialist-only accessibility.
This article has been created from a transcription of the panel discussion Generative AI: Breakthroughs, Barriers, and What’s Next, which took place at The AI Summit London on the Next Generation Stage on 11 June 2025.
Foundation Models Are Shrinking Dramatically While Maintaining Competitive Performance
The next frontier lies in edge deployment, where AI runs directly on phones and watches. This shift matters because smaller models democratise AI access. Enterprises can deploy powerful capabilities without massive infrastructure, while developers gain tools that run on personal devices. Eleana Kafeza from TII explains how the Falcon H1 series combines transformer and Mamba (state space) architectures to achieve breakthrough efficiency. The 3.5 billion parameter Falcon model matches the performance of models with 7-10 billion parameters. Users can now run state-of-the-art AI on laptops.
Why Edge AI Deployment Represents the Next Breakthrough in Mobile Computing
Edge deployment represents the next leap forward. TII's Falcon Edge series achieves a memory footprint of just 1GB while outperforming models in the 0.2-0.5 billion parameter range. This enables deployment on phones and watches, expanding AI's reach beyond traditional computing platforms.
How Open-Source AI Distribution Accelerates Innovation Across Industries
Open-source distribution amplifies these advances. Kafeza emphasises that making these models freely available creates an ecosystem where "AI is for everybody". Developers can download and deploy these tools immediately, accelerating innovation across industries.
Key Takeaway
Smaller, more efficient foundation models are breaking down barriers to AI adoption. Edge deployment and open-source distribution mean powerful AI capabilities will soon run on everyday devices.
Breaking Through AI Barriers: Infrastructure and Evaluation Challenges
The Critical Gap Between AI Research Infrastructure and Enterprise Production Needs
Today's AI infrastructure remains stuck in research mode. Production demands a complete rethink. Why does this matter? Companies racing to deploy AI face a stark reality: the tools built for laboratory experiments cannot handle enterprise-scale operations. The gap between research infrastructure and production needs threatens to slow AI adoption across industries.
Why Current AI Infrastructure Remains "Rudimentary" for Enterprise Deployment
Nadav Eiron from the panel "Generative AI: Breakthroughs, Barriers, and What's Next" describes current AI infrastructure as "rudimentary". Researchers built these systems for maximum control and visibility into machine internals. Production environments need the opposite: abstractions and lifecycle management capabilities that simply don't exist yet.
Technical Barriers and Integration Standards Blocking AI Enterprise Adoption
Technical barriers compound these infrastructure challenges. Huseyn Gorbani identifies context window limitations as a major constraint. Integration standards like MCP (Model Context Protocol) and A2A (Agent2Agent) remain in their infancy. These protocols promise standardisation but lack the maturity enterprises require.
Register for The AI Summit New York 2025 or browse our conference agenda to see more information about our upcoming talks and speakers.
Enterprise AI Deployment Complexity: Beyond Basic Model Training
Enterprise deployments face mounting complexity beyond basic model training. Eiron observes customers increasingly concerned with data movement, multi-cloud presence, security, and manageability. Model development sits at the center of an expanding ecosystem where surrounding infrastructure determines success or failure.
Key Takeaway
AI's infrastructure gap, built for researchers rather than operators, blocks enterprise deployment at scale. Companies need production-ready abstractions and mature integration standards before AI can deliver on its promise.
How AI Agent Frameworks Are Transforming Enterprise Operations Beyond Chatbots
Enterprise AI has moved beyond simple chatbots. Agent frameworks now automate entire campaigns and execute operational tasks autonomously. This shift matters because it transforms AI from a conversational tool into an operational engine. Companies can deploy systems that act, not just respond.
Multimodal AI Integration: Combining Text, Images, Vision and Sound in Agent Systems
Lourens Walters from the panel "Generative AI: Breakthroughs, Barriers, and What's Next" highlighted how multimodal AI combines text, images, vision and sound within agent frameworks. This fusion expands what AI can achieve operationally. The technology moves from answering queries to generating content, optimising SEO, and building infrastructure that improves itself.
Autonomous AI Marketing Systems: Real-Time Campaign Optimization and Execution
Marketing shows particular promise for these autonomous systems. AI agents could optimise and execute campaigns without human intervention. They could analyse performance data, adjust strategies, and deploy new content in real time. The agentic infrastructure could evolve continuously. Each interaction teaches the system, creating a feedback loop that enhances performance. Companies could therefore gain tools that grow smarter through use, not just through updates.
Key Takeaway
Agent frameworks represent AI's evolution from assistant to operator, enabling autonomous execution of complex business tasks with continuous self-improvement.
Get a flavour of what’s in store for The AI Summit New York 2025.
The Future of Ambient AI: What's Next for Business Operations
How Ambient AI Will Transform Business Operations Through Natural Language Processing
Ambient AI will soon permeate every corner of business operations. Open source technologies now enable widespread development and integration across industries. This shift matters because complexity vanishes. Eleana Kafeza explains how users will simply ask natural language questions, like "Is the German market ready for my business?" and sophisticated backend processes will deliver answers. No technical expertise required.
Why Small and Medium Enterprises Will Benefit Most from Ambient AI Technology
Small and medium enterprises stand to gain most from this transformation. They'll access powerful analytical capabilities without hiring specialists or learning complex tools. The playing field levels as ambient AI democratises advanced business intelligence.
Physical AI Systems: The Next Evolution in Real-World Business Guidance
Physical AI systems will extend this revolution beyond screens. Huseyn Gorbani predicts robots and embodied machines will provide real-world guidance within three to five years. These systems won't just execute tasks—they'll explain problems and teach users to solve them independently. The future points toward AI that empowers rather than replaces human capability. Systems will visualise problems, demonstrate solutions, and build user competence for long-term self-sufficiency.
Key Takeaway
Ambient AI will transform business operations by eliminating technical barriers and enabling natural language interactions with sophisticated analytical systems. Small businesses will particularly benefit as complex capabilities become accessible to all.
Conclusion: Preparing for the Generative AI Transformation
The Future of Distributed AI: Small Models Deployed Everywhere
The future of generative AI lies in small models deployed everywhere. Open source ecosystems democratise access and accelerate innovation. Eleana Kafeza champions this distributed approach. Small language models running across devices enable generic solutions with enhanced orchestration capabilities. Open source models create an ecosystem "where AI is for everybody", she argues, allowing anyone to download and build upon existing work.
Overcoming AI Adoption Barriers: The Need for Better Abstractions and Standards
Yet significant barriers remain. Nadav Eiron identifies the lack of proper abstractions as AI's biggest adoption obstacle. Current development demands expertise levels "we should not expect of everybody". Better abstractions would unlock broader participation and innovation.
Enterprise AI Challenges: Multi-Cloud Security and System Management Complexity
Enterprise concerns have shifted beyond basic model training. Companies now grapple with data movement across multi-cloud environments, security protocols, and system manageability. The AI infrastructure ecosystem grows increasingly complex. Success requires thinking beyond individual models. Long-term AI adoption needs commonalities and abstractions that don't yet exist. Without these foundations, the technology remains accessible only to specialists.
Key Takeaway
Generative AI's transformation hinges on making powerful models accessible through open source distribution and better abstractions. The shift from centralised to distributed deployment marks a crucial evolution in how organisations will build and maintain AI systems.