2 min read · from Machine Learning

Sharing all KGC 2026 decks. More production-grade KG systems than I've seen at any conference. [D]

Our take

I missed the Knowledge Graph Conference in New York this year, but I caught some insightful presentations virtually and have compiled all the decks for you. This year's event showcased numerous production-grade knowledge graph systems from enterprises such as Bloomberg and AbbVie, demonstrating real-world deployments rather than mere proofs of concept. AbbVie, for instance, presented its ARCH system, which integrates a scoring engine and an LLM for plain-language queries.

The recent Knowledge Graph Conference (KGC) in New York showcased a significant shift in how knowledge graph technology is approached. Although many of us could not attend in person, the decks available online provide a valuable glimpse into the current state of production-grade knowledge graph systems. The emphasis on live deployments, rather than proofs of concept, marks an encouraging trend in the AI landscape, particularly as enterprises are increasingly expected to meet real compliance requirements with concrete engineering.

Notably, organizations like Bloomberg, AbbVie, and Morgan Stanley presented applications where knowledge graphs serve as core infrastructure rather than a simple retrieval layer. Bloomberg's formal dependency model for ontology governance and AbbVie's ARCH system, which pairs drug and disease-area intelligence with an LLM companion, illustrate an approach where the knowledge graph is the source of truth. By connecting to a scoring engine and a researcher dashboard, these systems let users navigate complex data with greater ease. Meanwhile, Morgan Stanley's automated SHACL drift detection shows knowledge graphs proactively maintaining the integrity of a semantic layer. This shift from looking up information to reasoning over it represents a significant evolution in how organizations can harness data.

The implications of these advancements are notable. As enterprises adopt knowledge graphs for operational workloads, we can expect a broader discussion of the limitations of traditional vector databases. The skepticism towards the "only using vector DBs" framing is well-founded: the KGC presentations demonstrated the operational realities of running knowledge graphs in production. That discussion matters for data management professionals, since it prompts a reassessment of how we approach data architecture and integration.

Looking ahead, organizations that treat knowledge graphs as foundational elements of their data infrastructure will likely find themselves at a competitive advantage. The open question is how these advancements will influence the broader landscape of data management and AI, and how deployed systems like these will reshape decision-making across industries.

Didn't make it to New York for the Knowledge Graph Conference this year, but caught some talks virtually and managed to download all the decks. Sharing them below because some of what was shown is worth knowing about.

The majority of the presentations described live production systems: enterprises showing up with real engineers delivering real compliance requirements. That's unusual for AI events, where most talks are proofs of concept with a "coming soon to prod" slide at the end.

For example, Bloomberg showed a formal dependency model for ontology governance. AbbVie walked through ARCH, their internal KG for drug and disease-area intelligence, connected to a scoring engine, a researcher dashboard, and an LLM companion for plain-language queries. The KG is the source of truth; the LLM is the interface. Even Morgan Stanley showed continuous SHACL drift detection on risk reporting data - automated weekly checks that alert when the semantic layer deviates from what's governed.
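To make the drift-detection idea concrete, here's a minimal sketch, not Morgan Stanley's actual implementation. Real deployments run a SHACL engine (e.g. pySHACL) against an RDF store; here triples are plain tuples, a "shape" is a check over the graph, and all names (`ms:RiskReport`, `ms:counterparty`) are invented for illustration:

```python
# Hypothetical sketch of automated constraint-drift detection, in the
# spirit of weekly SHACL checks on governed data. Triples are plain
# tuples; a "shape" is a function returning violations.

def node_shape(target_class, required_props):
    """Return a check: every instance of target_class must carry
    each property in required_props (a SHACL minCount-1 style rule)."""
    def check(triples):
        instances = {s for s, p, o in triples
                     if p == "rdf:type" and o == target_class}
        violations = []
        for inst in sorted(instances):
            props = {p for s, p, _ in triples if s == inst}
            for req in required_props:
                if req not in props:
                    violations.append((inst, f"missing {req}"))
        return violations
    return check

def drift_report(triples, shapes):
    """Run all shapes; a non-empty report means the data has drifted
    from the governed semantic layer and should trigger an alert."""
    report = []
    for name, check in shapes.items():
        report.extend((name, *v) for v in check(triples))
    return report

graph = [
    ("trade:1", "rdf:type", "ms:RiskReport"),
    ("trade:1", "ms:counterparty", "org:acme"),
    ("trade:2", "rdf:type", "ms:RiskReport"),
    # trade:2 lacks ms:counterparty -> this is the drift
]
shapes = {"RiskReportShape": node_shape("ms:RiskReport", ["ms:counterparty"])}
print(drift_report(graph, shapes))
# [('RiskReportShape', 'trade:2', 'missing ms:counterparty')]
```

Scheduled weekly, a non-empty report is exactly the kind of "semantic layer deviated from what's governed" alert described in the talk.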

Crux: knowledge graphs are being actively used as infrastructure, not a retrieval layer on top of vectors. The graph is doing reasoning work, not lookup work.
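An illustrative contrast of that distinction (my toy example, not from the talks): a vector store answers "what is similar to X?", while a graph answers multi-hop questions like "does this compound reach this disease through any chain of relations?". The entities and predicates below are invented:

```python
# Multi-hop reasoning over a toy knowledge graph: breadth-first
# traversal finds whether any chain of relations connects two
# entities - work an embedding similarity lookup can't do directly.
from collections import deque

edges = {  # subject -> [(predicate, object), ...]
    "aspirin": [("inhibits", "COX-2")],
    "COX-2": [("drives", "inflammation")],
    "inflammation": [("implicated_in", "arthritis")],
}

def reachable(graph, start, goal):
    """Is there any relation path from start to goal?"""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:
            return True
        for _, obj in graph.get(node, []):
            if obj not in seen:
                seen.add(obj)
                queue.append(obj)
    return False

print(reachable(edges, "aspirin", "arthritis"))  # True
```

The answer follows a three-hop chain (inhibits, drives, implicated_in) that no single stored document needs to contain, which is the "reasoning work, not lookup work" point.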

We've been skeptical of the "only using vector DBs" framing for a while. These production systems are the clearest evidence I've seen of where that breaks down - and what the alternative actually looks like when it's running. Link to all the decks below.

All decks here:

https://drive.google.com/drive/folders/1Csdv4hZePrBMJGggsisPXYBueTRCK1kV?usp=sharing

submitted by /u/Ok_Gas7672


Tagged with

#Knowledge Graph Conference