Web Search has already been disrupted by AI — just take a look at how readily Google is presenting users with AI Overviews (summaries of search results) at the top of their results pages, how Bing early on integrated OpenAI’s GPT models, and how Perplexity continues to build on its own AI-driven web search platform and browsers.
Nimble announced the launch of its Agentic Search Platform, a system designed to transform the public web into trusted, decision-grade data for AI systems and business workflows.
The launch is supported by $47 million in Series B financing led by Norwest, with participation from Databricks Ventures and others, bringing the company’s total funding to $75 million.
The initiative addresses a fundamental bottleneck in the current AI era: while large language models (LLMs) are becoming more sophisticated, they often reason over incomplete or unverifiable external information. Nimble’s platform aims to eliminate this “guesswork gap” by providing a governed data layer that searches, navigates, and validates live internet data in real time.
In an exclusive interview with VentureBeat, Nimble co-founder and CEO Uri Knorovich reflected on the early skepticism regarding his vision of a machine-centric internet.
“When we started this company and I first went to investors, I told them the web is built for humans, but machines are going to be the first citizens of the web,” Knorovich recalled. He noted that while initial reactions labeled him as “too visionary,” the current reality of AI adoption has validated his thesis.
The core of Nimble’s solution is a proprietary distributed architecture that orchestrates specialized agents to perform tasks traditionally handled by human researchers or brittle web scrapers. According to the company’s infrastructure documentation, the process is broken down into five distinct layers:
Headless browser and browsing agents: These layers manage the initial interaction with a target domain, navigating complex site structures as a human would.
Parsing agents: These agents interpret the page content, identifying relevant data elements across various formats.
Data processing agents: This layer aggregates, filters, and cleans noisy internet data to produce specific, structured answers.
Validation agents: The final step involves verifying the results to ensure accuracy and completeness before delivery.
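The flow through these layers can be sketched as a simple pipeline. The function names and interfaces below are illustrative stand-ins for the layers described above, not Nimble's actual internals:

```python
# Illustrative sketch of the layered agent pipeline described above.
# All names and data shapes here are assumptions, not Nimble's API.

def browse(url):
    """Headless browser / browsing agents: fetch and render the page."""
    return f"<html>content of {url}</html>"  # stand-in for a real render

def parse(html):
    """Parsing agents: extract relevant data elements from the page."""
    return {"raw": html.strip()}

def process(records):
    """Data processing agents: aggregate and clean noisy records into
    a specific, structured answer."""
    return {"answer": [r["raw"] for r in records]}

def validate(result):
    """Validation agents: verify the result is non-empty and complete
    before delivery."""
    assert result.get("answer"), "empty result rejected"
    return result

def agentic_search(urls):
    """Run all layers in order over a set of target URLs."""
    pages = [browse(u) for u in urls]
    return validate(process([parse(p) for p in pages]))
```

The point of the sketch is the ordering: rendering, parsing, and cleaning are separate agents, with validation as a final gate before anything reaches the caller.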
Unlike standard search engines designed for consumer link-clicking, this architecture uses multimodal and reasoning capabilities from frontier models—including those from OpenAI, Anthropic, and Meta—to control real browsers. This allows Nimble to navigate dynamic layouts and cross-check results, producing auditable data outputs rather than simple text summaries.
Knorovich points out that the scale of AI interaction with the web is fundamentally different from human behavior. “We, as humans, search for maybe three or five options before we make decisions… but every day, Nimble performs more than 3.2 million interactions on the web,” he explained. This sheer volume of billions of monthly searches represents a programmatic shift that requires a new type of infrastructure.
The bottleneck for enterprises today, according to Knorovich, isn’t the intelligence of the models, but the quality of the data they can access. “Agents are the headlines, and accurate and reliable web search is the bottleneck,” he stated.
Knorovich explicitly differentiates Nimble from general-purpose tools like Google or consumer AI search assistants.
While Google has built a search experience for consumers that is optimized for speed and finding a local restaurant, enterprises require high-scale, high-accuracy results to make multi-million dollar decisions.
“General-purpose web search tools are great for general answers, such as who Leo Messi’s wife is,” Knorovich remarked during the interview. “But enterprises need deep, granular data, and they need the ability to control the search filters, to control the regulation, to control what is a trusted source.” Unlike consumer AI modes that may summarize a Reddit post or high-level news, Nimble provides “street-level” information that can be stored directly in an enterprise system of record.
The Agentic Search Platform is delivered through two primary interfaces designed for enterprise scalability:
Web search agents: A no-code AI workflow builder that enables business teams to describe the data they need and receive structured data streams without writing a line of code.
Web tools SDK: A suite of APIs for builders to search, extract, and crawl the web directly from their code. This includes specialized tools like the /crawl API for mapping entire domains and the /map API for creating domain trees.
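As a rough sense of how a builder might call such endpoints: the `/crawl` and `/map` endpoint names come from the announcement, but the base URL, authentication scheme, and parameter names in this sketch are assumptions, not documented API details:

```python
# Hypothetical sketch of calling the Web Tools endpoints. Only the
# /crawl and /map paths are from the announcement; the host, auth
# header, and payload fields below are illustrative assumptions.
import json
import urllib.request

BASE_URL = "https://api.example.com/v1"  # placeholder, not the real host

def build_request(endpoint, payload, api_key):
    """Build a POST request for a web-tools endpoint such as /crawl or /map."""
    return urllib.request.Request(
        f"{BASE_URL}{endpoint}",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Map a domain into a tree, then crawl a branch of interest:
map_req = build_request("/map", {"domain": "example.com"}, "API_KEY")
crawl_req = build_request("/crawl", {"url": "https://example.com/products"}, "API_KEY")
```

A real integration would send these requests and stream the structured responses into a warehouse; the sketch only shows the shape of the calls.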
The platform is built to deliver data with greater than 99% accuracy — meaning less than 1% of the content in each returned search result is inaccurate or hallucinated — and a latency of 1–2 milliseconds per request.
It integrates natively with major data environments, allowing users to stream clean data directly into Databricks, Snowflake, S3, or Microsoft Fabric.
During the interview, Knorovich emphasized that Nimble is designed to be model-agnostic, working seamlessly with state-of-the-art models from OpenAI, Anthropic, and Google’s Gemini. This flexibility allows companies to use Nimble alongside their existing tech stack, whether they are running models in the cloud or on-premise for high-security environments like healthcare or banking.
Knorovich provided several real-world examples of how this “street-level” data impacts professional workflows. For instance, a real estate broker looking to expand into a new territory doesn’t need a high-level summary from a general-purpose AI.
“If you want to know what’s happening in commercial real estate in Atlanta… you’re not looking for search that’s optimized for the millisecond,” Knorovich explained. “You’re looking for street-level, neighborhood-level information… data that you can actually see on a table or download to Excel.”
Another use case involves major financial institutions utilizing Nimble for “know your customer” (KYC) processes. By deploying an autonomous search agent, banks can cross-reference multiple public reports, criminal records, and address verifications to build a complete profile of a client before they even enter the building. The goal, Knorovich noted, is to provide the “external truth” that exists outside an organization’s internal firewalls.
Nimble differentiates itself from legacy scraping tools through a rigorous focus on governance and trust. The platform is “compliant-by-design,” holding certifications for SOC2 Type II, GDPR, CCPA, and HIPAA.
Pricing is structured to support both experimental startups and high-scale enterprise operations, aligned with the volume and depth of data retrieved.
“Pricing should be aligned with the value that the user is getting… therefore, we are pricing by the amount of searches that you’re running,” Knorovich said.
Search and answer APIs: Standard search inputs cost $1 per 1,000, while the “Answer” function—which provides reasoning based on search results—costs $4 per 1,000.
Managed services: For larger organizations, managed tiers start at $2,000 per month (Startup) and scale to $15,000 per month (Professional) for unlimited agents and priority support.
Proxy access: A network of over 1 million residential proxies is available starting at $7.50 per GB.
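The two per-unit API rates lend themselves to a quick back-of-the-envelope cost model; the figures below use only the published $1 and $4 per-1,000 prices, and the example volumes are hypothetical:

```python
# Back-of-the-envelope cost model from the published per-unit prices:
# $1 per 1,000 search calls and $4 per 1,000 "Answer" calls.
# Example volumes are made up for illustration.
SEARCH_PRICE_PER_1K = 1.00
ANSWER_PRICE_PER_1K = 4.00

def monthly_api_cost(searches, answers):
    """Estimated monthly spend for a given call volume, in dollars."""
    return (searches / 1000) * SEARCH_PRICE_PER_1K + (answers / 1000) * ANSWER_PRICE_PER_1K

# e.g. 500,000 searches plus 100,000 reasoning answers per month:
cost = monthly_api_cost(500_000, 100_000)  # 500 * $1 + 100 * $4 = $900
```

At these rates, the reasoning-backed “Answer” calls dominate spend quickly, which is presumably why the managed tiers exist for high-volume users.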
The transition to agentic search has already been operationalized by several Fortune 500 companies and AI-native startups:
Julie Averill, former CIO at Lululemon, stated that pricing intelligence that once took weeks to review can now be acted on in minutes by putting control in the hands of an agent.
Itamar Fridman, CEO and Co-founder of Qodo, noted that the platform’s scalability was “crucial in developing more robust and reliable AI systems” by feeding LLMs with high-quality data.
Dennis Irorere, Data Engineer at TripAdvisor, highlighted that the platform simplifies the extraction of structured data from complex sources, which he described as “transformative” for his role.
Grips Intelligence reported scaling to over 45,000 e-commerce sites using Nimble’s Web API to deliver real-time pricing and product data.
Alta utilizes the platform to power millions of AI-driven go-to-market workflows daily, reporting 3–4× deeper context and more than 99% reliability.
The $47 million Series B funding announced alongside the platform will be used to accelerate research in multi-agent web search and further develop the governed data layer.
The round saw participation from a wide ecosystem of investors, including Target Global, Square Peg, Hetz Ventures, Slow Ventures, R-Squared Ventures, J-Ventures, and InvestInData.
Andrew Ferguson, VP of Databricks Ventures, noted that Nimble complements their Data Intelligence Platform by providing a “real-time web data layer” that extends workflows beyond internal sources. This strategic investment signals a shift in the industry toward prioritizing “external truth” to ground mission-critical AI applications.
For Knorovich, the future of the web belongs to programmatic interaction. “Programmatic web search is where we are building towards,” he concluded. By moving away from legacy data vendors and brittle scrapers, Nimble aims to provide the real-time structure needed for AI to act with confidence in the real world.
On Tuesday, Anthropic published tools that let Claude read, analyze and translate legacy COBOL into modern languages like Java and Python. By the end of the trading day, investors had wiped roughly $40 billion from IBM’s market cap — the company’s biggest single-day drop in 25 years — pricing the announcement as an existential threat to IBM’s mainframe business.
The reaction was swift. It was also built on a fundamental misreading of why enterprises run mainframes in the first place.
COBOL is 66 years old. Designed in 1959, it runs predominantly on IBM mainframes and continues to power transaction processing systems, with an estimated 250 billion lines of COBOL in active production, according to the Open Mainframe Project.
The engineers who wrote it are retiring; the ones replacing them largely cannot read it. For decades, that skills gap has been one of enterprise IT’s most expensive unsolved problems — and one IBM has been working to fix with AI since at least 2023, when it launched watsonx Code Assistant for Z to help migrate COBOL to modern Java.
Claude Code, Anthropic says, can now analyze entire codebases, map hidden dependencies, and generate working translations of code that most engineers today cannot read. For enterprises running COBOL on distributed platforms — Windows, Linux and other non-mainframe environments — that capability is genuinely useful and increasingly practical.
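For readers who have never seen COBOL, here is a toy example of what “translation” means at the smallest scale — a two-statement COBOL fragment (shown in comments) and a hand-written Python equivalent. This is illustrative only, not output from any vendor tool, and it hints at why the easy part is easy: the semantics of a single computation transfer cleanly, while the surrounding system does not.

```python
# Illustrative only: a toy COBOL fragment and a hand-written Python
# equivalent. Real migrations involve dependency mapping, data
# redesign, and runtime replacement, not just statement translation.
#
# COBOL source:
#   COMPUTE WS-INTEREST ROUNDED = WS-BALANCE * WS-RATE / 100
#   ADD WS-INTEREST TO WS-BALANCE
from decimal import Decimal, ROUND_HALF_UP

def apply_interest(balance, rate_percent):
    """Python equivalent of the fragment above. COBOL works in fixed-point
    decimal, and its ROUNDED clause rounds halves away from zero, so the
    faithful translation uses Decimal with ROUND_HALF_UP rather than
    binary floats."""
    interest = (balance * rate_percent / 100).quantize(
        Decimal("0.01"), rounding=ROUND_HALF_UP
    )
    return balance + interest

# apply_interest(Decimal("1000.00"), Decimal("5")) -> Decimal("1050.00")
```

Even this trivial case shows a subtlety — COBOL’s decimal arithmetic does not map onto floating point — and it is exactly these semantic details, multiplied across millions of lines, that make equivalence checking the real work.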
“Modernizing COBOL has been a technically solved problem for a while,” Matt Braiser, analyst at Gartner, told VentureBeat. “The real problem is that the costs of modernization are high and the ROI is low.”
Amazon and Google have been offering AI-powered COBOL migration tools for years. AWS Transform and a comparable Google Cloud Platform service both targeted the same problem: reducing friction for customers looking to move mainframe workloads to the cloud.
“This is basically one more source of competition,” Raj Joshi, senior vice president at Moody’s Ratings, told VentureBeat. “IBM has always lived in a very competitive domain. On the margin, this thing is basically negative, no question about that. There’s one more powerful competitor. But IBM has coexisted with these threats.”
Steve McDowell, chief analyst at NAND Research, cuts to the structural argument: “Applications don’t run on mainframes because they’re written in COBOL,” he said. “They run on mainframes because mainframes deliver a class of determinism, scalable compute and reliability that general purpose servers can’t match.”
The issue runs deeper than market positioning. “GenAI tools are helpful, but their non-deterministic nature means the resulting code is not consistent — the same operation will be implemented in different ways in different parts of the code,” Braiser said. “Leading tools combine deterministic and non-deterministic approaches. None of this solves the ROI problem, though.”
“Translating COBOL is the easy part,” IBM communications director Steven Tomasco told VentureBeat. “The real work is data architecture redesign, runtime replacement, transaction processing integrity, and hardware-accelerated performance built over decades of tight software and hardware coupling. That is the problem IBM has spent decades learning to solve, and AI is the most powerful tool we have ever had to do it.”
According to IBM, Royal Bank of Canada, the National Organization for Social Insurance and ANZ Bank have all used watsonx Code Assistant for Z to accelerate modernization of COBOL code without moving off IBM Z.
That does not mean Anthropic has no competitive foothold. For enterprises running COBOL outside the mainframe — on distributed systems, Windows and Linux environments — Claude Code enters a space where IBM’s vertical integration is less of an advantage. “IBM understands mainframe technology at a level that others can’t match. If I’m only looking at COBOL, I’m using IBM’s watsonx,” McDowell said. “Anthropic, however, has a broader footprint within a lot of development teams, where a single vendor makes it worthwhile.”
Senior data and infrastructure engineers will spend the next few weeks fielding questions from executives who saw the headlines and assumed the hard problem just got solved. It did not.
“It’s COBOL, but there are numerous applications tied to it,” Joshi said. “It’s not like you transform millions of lines and somehow you are ready to go to cloud. It’s a massive risk assessment, dependencies and all those things.”
The more useful question for buyers is whether this week’s noise creates an opening. Braiser thinks it does.
“They should use the resulting board-level and shareholder discussions to review postponed modernization initiatives and see if any of them now have ROI,” Braiser said.
McDowell was blunt on the competitive question. “Will Anthropic take business from IBM’s tool? Yes, of course,” he said. “But I’d be surprised if that tool was making significant revenue for IBM.”
Chirag Mehta, analyst at Constellation Research, cautioned that IT leaders should not react emotionally or rewrite strategy overnight.
“Treat this as a reason to run a small, bounded pilot to measure outcomes, not as a reason to rip and replace vendors,” Mehta told VentureBeat.
Mehta suggests that enterprises pick one well-scoped application slice or workflow with clear inputs and outputs, and evaluate approaches apples-to-apples: quality of dependency mapping, quality of recovered business logic documentation, test coverage and equivalence checks, performance and reliability regressions.
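The “equivalence checks” Mehta mentions can be as simple in principle as running the legacy and translated implementations against shared test vectors and demanding identical outputs. The harness below is a generic sketch of that idea, with made-up example functions, not any vendor’s tooling:

```python
# Sketch of an equivalence-check harness: run the legacy and the
# translated implementation on the same inputs and collect any
# divergence. Function names and inputs here are illustrative.
def check_equivalence(legacy_fn, translated_fn, test_vectors):
    """Return a list of (args, legacy_output, translated_output) tuples
    where the two implementations disagree; empty means behavioral
    parity on this slice of inputs."""
    mismatches = []
    for args in test_vectors:
        old, new = legacy_fn(*args), translated_fn(*args)
        if old != new:
            mismatches.append((args, old, new))
    return mismatches

# e.g. comparing two forms of the same interest calculation:
legacy = lambda bal, rate: round(bal * (1 + rate / 100), 2)
translated = lambda bal, rate: round(bal + bal * rate / 100, 2)
assert check_equivalence(legacy, translated, [(1000.0, 5.0), (250.0, 3.5)]) == []
```

In practice the hard part is not the harness but assembling test vectors that actually cover the legacy system’s behavior, including the edge cases no one documented.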
In Mehta’s view, the bigger reminder is that modernization is more than converting code. The hard parts are extracting institutional knowledge, reworking processes and controls, change management, and containing operational risk in systems that cannot break. AI can compress the “analysis and translation” work, but it does not eliminate the governance and accountability burden.
“The teams that win will treat AI as an accelerator inside a disciplined modernization program, with measurable checkpoints and risk guardrails, not as a magic conversion button,” Mehta said.