Former Visiting Professor and Executive Director SCI at Massachusetts Institute of Technology

Author: zeier

  • Architectural Advances in Real-Time SAP Systems – From In-Memory Computing to Augmented Agentic AI

    Honorary Professor of Computer Science (AI & Intelligent Enterprise Systems), University of Magdeburg – https://imta-ovgu.de/

    Over the past decades, enterprise systems have undergone several fundamental architectural shifts. Traditional database-centric architectures, designed for transactional consistency and delayed reporting, reached their limits as enterprises increasingly required real-time insight, system-wide optimization, and immediate operational response.

    Early research and industrial work on real-time databases, in-memory data processing, and large-scale optimization — including supply chain planning and execution — demonstrated that meaningful business decisions cannot be derived from delayed aggregates alone. Competitive advantage emerges when complete enterprise data sets can be processed, analyzed, and acted upon in memory, at sub-second latency, across both transactional and analytical workloads.

    This line of research laid the foundation for a new generation of enterprise systems.

    The introduction of SAP HANA represented a decisive architectural break. By eliminating disk-bound constraints and unifying transactional and analytical processing on a single in-memory architecture, HANA enabled real-time visibility into operational data at unprecedented scale. For the first time, enterprises could evaluate end-to-end processes — such as supply chains, logistics, pricing, and production — directly on live data rather than relying on delayed snapshots.

    This architectural shift fundamentally changed how enterprise systems were designed and operated. Optimization could move from periodic planning cycles to continuous, system-wide decision-making based on the complete and current state of the enterprise.

    SAP S/4HANA subsequently translated these architectural principles into a new generation of enterprise applications. While the data platform and core application layers were fundamentally re-architected, one structural challenge remained largely unresolved: decades of historically grown SAP custom code.

    In many large enterprises, custom extensions accumulated across multiple system generations, often optimized for earlier architectures and tightly intertwined with business logic. While functionally essential, this custom code increasingly became the primary driver of complexity, upgrade risk, transformation cost, and long-term total cost of ownership.

    SAP later introduced the concept commonly referred to as Clean Core to describe the architectural objective of stabilizing the core while enabling differentiation through extensions. Architecturally, however, achieving this objective requires more than guidelines. It demands deep system understanding, high-performance interfaces aligned with in-memory execution, and a fundamentally new approach to analyzing, restructuring, and governing custom code over time.

    Recent advances in augmented agentic AI systems now make it possible to address this challenge at architectural depth.

    Augmented agentic AI combines autonomous reasoning agents with human expertise, enabling continuous analysis and optimization of complex enterprise systems. Rather than treating custom code as a one-time transformation problem, agentic systems can operate across the entire lifecycle of SAP landscapes — understanding business logic in context, assessing architectural fit, identifying optimization opportunities, and supporting ongoing modernization decisions.
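
    To make this lifecycle view more concrete, the following minimal Python sketch illustrates the general shape of such a continuous analysis loop over custom code objects: each object is assessed against a clean-core, in-memory target architecture, and the grouped recommendations are handed to a human architect for review. All class names, fields, and classification rules are hypothetical illustrations invented for this sketch; they do not describe Nova Intelligence's or SAP's actual implementation.

        # Hypothetical sketch of an agentic analysis cycle over SAP custom code.
        # Names and rules are illustrative only, not an actual SAP or Nova API.
        from dataclasses import dataclass

        @dataclass
        class CustomCodeObject:
            name: str                     # e.g. a Z-report or a user exit
            lines_of_code: int
            modifies_core_directly: bool  # tightly coupled to the standard core
            is_still_referenced: bool     # still called by any business process

        def assess_fit(obj: CustomCodeObject) -> str:
            """Classify one object against a clean-core, in-memory target architecture."""
            if not obj.is_still_referenced:
                return "retire"
            if obj.modifies_core_directly:
                return "restructure"      # candidate for a stable extension interface
            return "keep"

        def analysis_cycle(landscape: list) -> dict:
            """One pass of the continuous loop: group recommendations for review by a
            human architect (the 'augmented' part of augmented agentic AI)."""
            recommendations = {"retire": [], "restructure": [], "keep": []}
            for obj in landscape:
                recommendations[assess_fit(obj)].append(obj.name)
            return recommendations

        landscape = [
            CustomCodeObject("ZSD_PRICING_EXIT", 1200, True, True),
            CustomCodeObject("ZFI_OLD_REPORT", 400, False, False),
            CustomCodeObject("ZMM_STOCK_API", 300, False, True),
        ]
        print(analysis_cycle(landscape))
        # {'retire': ['ZFI_OLD_REPORT'], 'restructure': ['ZSD_PRICING_EXIT'], 'keep': ['ZMM_STOCK_API']}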

    In this sense, agentic AI represents a new architectural instrument. Similar to how in-memory computing transformed enterprise data processing, augmented agentic AI enables enterprises to reason about, restructure, and continuously improve their systems at scale.

    This current work builds on more than three decades of research, architectural invention, and large-scale execution.

    Following early research in real-time databases, optimization, and supply chain systems, the initiation of the SAP HANA project marked a critical inflection point. As a co-initiator of HANA, my focus was the invention of a new architecture capable of processing complete enterprise datasets in real time.

    After the introduction of HANA, the architectural challenge shifted from invention to execution at scale. During subsequent years as Managing Director and Global CTO for SAP at Accenture, these architectural principles were translated into enterprise reality across hundreds of global SAP and S/4HANA implementations. This work made clear that sustainable system quality, stability, and economic viability cannot be achieved through isolated projects alone, but require continuous architectural governance across the full lifecycle of enterprise systems.

    In 2024, together with Emma Qian and Sam Yang, I founded Nova Intelligence to apply these architectural principles in practice.

    Nova Founding Team, Emma Qian, Sam Yang, and Prof. Dr. Alexander Zeier

    Nova Intelligence provides an end-to-end platform for SAP custom code lifecycle management, built on augmented agentic AI systems. The platform analyzes SAP landscapes, understands business logic in context, and supports agentic AI-based re-composition, restructuring, and continuous optimization of custom extensions in architectures optimized for high-performance, in-memory execution.

    Rather than replacing human expertise, the platform augments it. Architects and developers are supported by AI agents that operate continuously across analysis, transformation, optimization, and long-term governance — delivering immediate benefits while remaining relevant over many years of system evolution.

    The platform is already being applied successfully with customers, demonstrating that continuous, architecture-aware optimization of SAP custom code is no longer a theoretical concept, but a practical and sustainable capability for enterprises operating complex SAP landscapes.

    Further information on this ongoing work can be found at: https://www.novaintelligence.com



  • Ceremony with Dr. Reiner Haseloff, Minister President of Saxony-Anhalt, Germany.

    Opening event of the Enterprise Cloud Initiative and Cloud Academy, founded by Google Cloud, Accenture, and Otto-von-Guericke University Magdeburg.

    The welcome address was given by the Minister President of Saxony-Anhalt, Dr. Reiner Haseloff; afterwards, the scientific head of the cooperation, Prof. Dr. Klaus Turowski, presented the key points of the initiative.

    At the opening of the Cloud Academy

    The welcoming ceremony was held in a pleasant atmosphere and was joined by high-ranking representatives of the cooperation partners. These included Global Managing Director and Chief Technology Officer Prof. Dr. Alexander Zeier and Managing Director Intelligent Platform Services and SAP Business Group ASGR Dirk Appelhoff of Accenture, Director of Government Affairs & Public Policy for Germany Eveline Metzen and Partner Manager Achim Ramesohl of Google Cloud, as well as Prof. Dr. Klaus Turowski. The guests were greeted with a warm welcome speech from the university’s rector, Prof. Dr. Jens Strackeljan.

    With cloud platforms in mind as “the next big step”, the initiative targets a variety of endeavors in technological, economic, and scientific terms.

    While the Cloud Academy aims to prepare the next generation of the workforce for the increasingly urgent need for cloud computing expertise, the network of renowned companies serves to explore promising new research directions and opportunities, such as the use of cloud technologies for event-based retailing, real-time campaign management, near-real-time quality assurance, and fraud detection.

    More details on the website of my honorary professorship: https://imta-ovgu.de/cloudfirst/

    Press release (German): https://www.ovgu.de/Presse+_+Medien/Pressemitteilungen/PM+2021/November/PM+64_2021-p-118958.html



  • University of Magdeburg Appointed Dr. Alexander Zeier as Honorary Professor

    Students will learn about promising In-Memory Technology

    Students learning

    Dr. Alexander Zeier held his inaugural lecture on June 26th, 2013, as the first honorary professor at the Faculty of Computer Science of the Otto von Guericke University Magdeburg. He will teach in-memory technology and its applications.

    Dr. Alexander Zeier

    In-memory technology manages and organizes huge databases and enables companies to analyze large amounts of data and thus react in real time. Production, distribution, shipping, and product sales create large amounts of data, which can be stored in databases. Going forward, all relevant questions such as “what was shipped when, where, and to whom” can be answered and displayed on mobile devices in real time by analyzing operational mass data.

    With the appointment of Alexander Zeier as an honorary professor, students of computer science at the University of Magdeburg have the opportunity to learn about the latest software developments and to receive a sound education in this field. “We foresee a huge demand for experts with capabilities in this technology and are pleased that the University of Magdeburg has recognized this important field of education for its students and will focus on in-memory technology,” said Zeier.

    Dr. Alexander Zeier teaching

    Alexander Zeier

    Dr. Alexander Zeier is managing director of in-memory solutions at the global management consulting, technology services and outsourcing company Accenture. In this function, he works globally with clients to develop solutions based on in-memory technology. Dr. Zeier has been working with SAP technologies and solutions for over 20 years. He is co-inventor of ten patents filed regarding in-memory technology and applications for enterprise systems, and is co-author of the recently published book, “In-Memory Data Management – Technology and Applications”. During his academic career, Dr. Zeier has published more than 150 technical articles and seven books.

    Dr. Zeier received a Dipl.-Kfm./MBA degree from the University of Wuerzburg. He completed his studies in information technology at the Chemnitz University of Technology and gained his Ph.D. in Supply Chain Management Systems at the University of Erlangen-Nuremberg. Dr. Zeier has been a Visiting Professor in Residence at the Massachusetts Institute of Technology (MIT), focusing on in-memory technology and applications.

    German text on the website of the University of Magdeburg

    Facebook news of the University of Magdeburg on the appointment of Alexander Zeier as honorary professor



  • Chinese Edition of the In-Memory Data Management/HANA book by Hasso Plattner and Alexander Zeier is available now: translated, licensed, and published by Tsinghua University Press, the press of Tsinghua University, the #1 elite university in China for business applications.

    Chinese Edition of the In-Memory Data Management/HANA book



  • Dr. Alexander Zeier Joins Accenture as Global Lead for In-Memory Solutions

    Will Also Serve as Director of Programs for SAP HANA® Within the Accenture Innovation Center for SAP® Solutions

    MADRID; November 13, 2012 – Accenture (NYSE: ACN) today announced the appointment of Dr. Alexander Zeier as managing director of In-Memory Solutions. In this capacity, Zeier will work with Accenture clients to develop in-memory solutions, provide sales support to global industry teams in leveraging in-memory technology, and provide ongoing thought leadership. He will also serve as director of programs for the SAP HANA® platform within the Accenture Innovation Center for SAP® solutions.

    Dr. Zeier has been working with SAP technologies and solutions for over 20 years. Prior to joining Accenture, he was responsible for SAP’s first large in-memory application, and was a pivotal part of the development of SAP HANA, SAP’s platform for real-time analytics and applications. He holds ten patents related to in-memory technology and applications for enterprise systems, and is co-author with Hasso Plattner of the recently published book, “In-Memory Data Management – Technology and Applications.”

    “Alexander brings an incredible and unrivaled depth of expertise in the areas of analytics and in-memory technology,” said Christophe Mouille, global managing director of SAP business for Accenture. “Alexander possesses a truly unique understanding of the business value that can be unlocked through the power of in-memory computing. Our clients will benefit from his extensive experience in researching and developing in-memory technology that turns massive amounts of customer data into actionable insights.”

    “Accenture has been committed to developing solutions based on in-memory technology for several years, and I am excited to be joining this team to drive further innovations in this important strategic area,” said Zeier. “Companies are relying on transactional and analytical data more than ever. The capabilities enabled by in-memory computing combined with Accenture’s vast industry knowledge will result in better, faster insights and new innovative business processes for our clients.”

    Since March 2012, Dr. Zeier has been a Visiting Professor in Residence at the Massachusetts Institute of Technology (MIT), lecturing and conducting research around innovative enterprise applications and business process optimizations that leverage in-memory technology or SAP HANA. He was also deputy chair, Enterprise Platform and Integration Concepts, at the Hasso Plattner Institute in Germany, focusing on real-time, in-memory enterprise systems and RFID technology.

    Dr. Zeier received an MBA from the University of Würzburg. He completed his studies in information technology at the Chemnitz University of Technology and gained his Ph.D. in Supply Chain Management Systems at the University of Erlangen-Nuremberg.

    About Accenture
    Accenture is a global management consulting, technology services and outsourcing company, with more than 257,000 people serving clients in more than 120 countries. Combining unparalleled experience, comprehensive capabilities across all industries and business functions, and extensive research on the world’s most successful companies, Accenture collaborates with clients to help them become high-performance businesses and governments. The company generated net revenues of US$27.9 billion for the fiscal year ended Aug. 31, 2012. Its home page is www.accenture.com.

    Source: link



  • Review: Written by Sam Sliman, President, Optimal Solutions

    July 23, 2012

    Understand Big Data and the Next Generation of IT
    If you are looking to understand the next generation of information technology and the hype surrounding “Big Data”, look no further than Hasso Plattner and Alexander Zeier’s groundbreaking book, “In-Memory Data Management: Technology and Applications”. As the authors demonstrate, processor power and memory capacity have reached inflection points where we can realistically envision a new world of applications built on in-memory technology that are capable of crunching massive amounts of data in real time. They articulate for us just how critical in-memory breakthroughs are to this next phase of technological progress and describe in fascinating detail the basic design of a prototype columnar storage database that makes use of advanced compression algorithms and multi-core processing to turbocharge both analytical and transaction processing systems. It is easy to envision the new world of enterprise, analytical, cloud and mobile applications that leverage in-memory technology to put information and insight into the hands of users instantaneously. And in 2012 few can disagree with the last chapter, “The In-Memory Revolution Has Begun”, when you see the very real-life success of the SAP HANA platform that is based on the concepts in this revolutionary book. It should be required reading for anyone working at the intersection of business and technology today.

    This review is for: In-Memory Data Management: Technology and Applications (Hardcover). Direct Amazon.com Link to Review Page



  • Foreword for the German edition of the In-Memory book (published in June 2012)

    Title: Overcoming Limits
    In-memory data management is a groundbreaking innovation that will change how all of us live with and through IT.
    What is so groundbreaking about it, why now, and why from SAP and for ERP applications? This book by Hasso Plattner and Alexander Zeier answers all of these questions. As a small preview, here are a few essential points:
    Fast computing on large data volumes is, in itself, not really a new topic. Data collected in real time, such as weather data or data from the banking and insurance sector, and its analysis have always required mainframes with very large processor and main-memory capacity. Several milestones have also been reached over the last 20 years in handling very large data volumes in image processing. The success story of Pixar, with animated feature films in nearly "true reality", is a good example of how technological progress can influence business models or even make them possible in the first place.
    According to an IDC study, 19 million terabytes of data were created and replicated worldwide in 2011. By my own estimate, roughly 0.5 to 1 million terabytes of this are SAP data, namely essential data from sales, marketing, production, and finance, in other words the key data of every company. Precisely this data should be able to be correlated and analyzed in real time. The trend toward "big data" therefore continues unbroken, and ERP data is part of it.

    Where do we stand today?
    Due to technical limitations, large data volumes have so far been prepared asynchronously, that is, in the background, and processed in business analyses, evaluations such as SAP CO-PA (Controlling Profitability Analysis), or costing runs for the production of large plants. The drawbacks of missing real-time capability, duplicate data storage, complicated schema adjustments, and the user interfaces depending on them inevitably had to be accepted.
    File-based databases on slow storage media, including under SAP, are ultimately the reason for the lack of performance. This could be improved with new storage media such as solid-state drives (SSDs), but these solutions are not yet economically viable: the price per unit of storage capacity and their not yet mature long-term stability still prevent widespread commercial business use. In addition, classical databases with two-dimensional tables prevent fast data access, despite all the efforts of database vendors in the areas of indexing, database caching, and data compression.
    And this is where in-memory computing comes into play.

    New Opportunities for the Business
    The advantages of in-memory computing are enormous. Driven by the technological development of more affordable server models with specialized Intel processors, multi-core architectures, and RAM in the terabyte range, very large data volumes can now be held and processed in main memory at comparatively low prices. Accessing a block of data in main memory is roughly 2,000 times faster than accessing it on a classical hard disk.
    Another very important characteristic of SAP HANA is its column-oriented storage, which allows resource-efficient, and therefore faster, access to the data and also offers an ingenious way to compress structured data to a very high degree.
    Where the business could only be creative to a limited extent because of limiting factors in data processing speed, this technology opens up new perspectives. The boundaries between the transactional world and reporting will disappear. Data quality and the richness of data on mobile devices will also increase significantly, because the barriers between slow server-side computing and the requirements of fast visualization will fall. Existing and entirely new applications will emerge; they can be developed in a much more user-friendly way once all these limitations have fallen. In other words: "all data, available at any time, in every dimension, at every level of aggregation, and with just a few clicks", something we have all long wished for, a kind of "better Google" for structured and unstructured enterprise data.
    Dr. Rudolf Caspary, CTO REALTECH AG, Walldorf



  • Today and Tomorrow
    Imagine you live in New York City. Now, imagine that every time you want a glass of water, instead of getting it from the kitchen, you need to drive to the airport, get on a plane and fly to Germany and pick up your water there. From the perspective of a modern CPU, accessing data which is in-memory is like getting water from the kitchen.
    Accessing a piece of data from the computer’s hard disk is like flying to Germany for your glass of water. In the last 30 years the prohibitive cost of main memory has made the flight to Germany necessary. The last few years, however, have seen a dramatic reduction in the cost per megabyte of main memory, finally making the glass of water in the kitchen a cost-effective and much more convenient option.
    This orders-of-magnitude difference in access times has profound implications for all enterprise applications. Things that in the past were not even considered because they took so long, now become possible, allowing businesses concrete insight into the workings of their company that previously were the subject of speculation and guess-work. The in-memory revolution is not simply about putting data into memory and thus being able to work with it “faster”.

    It also reflects the convergence of two other major trends in the IT industry:
    a) the advent of massive multi-core CPUs and the necessity of exploiting this parallelism in software, and
    b) the stalling access latency of DRAM, which requires software to cleverly balance CPU and memory activity.
    Both trends have to be harnessed to truly exploit the potential performance benefits.

    Another key aspect of in-memory/HANA for real-time enterprise applications is a change in the way data is stored in the underlying database. This is of particular relevance for the enterprise applications that are our focus.

    The power of in-memory/HANA is in connecting all these dots.

    The Revolutionary Power of In-Memory/HANA
    Our six years of experience have shown us that many enterprise applications work with databases in a similar way. They process large numbers of rows during their execution, but crucially, only a small number of columns in a table might be of interest in a particular query. The columnar storage model, as used in HANA, allows only the required columns to be read while the rest of the table can be ignored. This is in contrast to the more traditional row-oriented model, where all columns of a table must be accessed, even those that are not necessary for the result. The columnar storage model also means that the elements of a given column are stored together.
    This makes the common enterprise operation of aggregation much faster than in a row-oriented model, where the data from a given column is stored among the other data in the row.
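
    As a small, plain-Python sketch of this difference (not HANA internals), consider summing one attribute of a sales table: the column store touches only the values of that single attribute, while the row store must walk every complete row.

        # Row-oriented: every row (all columns) is visited to aggregate one field.
        rows = [
            {"order_id": 1, "country": "DE", "amount": 120.0},
            {"order_id": 2, "country": "US", "amount": 80.0},
            {"order_id": 3, "country": "DE", "amount": 200.0},
        ]
        total_row_store = sum(row["amount"] for row in rows)

        # Column-oriented: each column is stored contiguously; the query reads only
        # the "amount" column and can ignore the rest of the table.
        columns = {
            "order_id": [1, 2, 3],
            "country": ["DE", "US", "DE"],
            "amount": [120.0, 80.0, 200.0],
        }
        total_column_store = sum(columns["amount"])

        assert total_row_store == total_column_store == 400.0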

    Scaling Out Through Parallelization Across Multiple Cores and Servers
    Single CPU cores are no longer getting any faster, but the number of CPU cores is still expected to double every 18 months. This makes exploiting the parallel processing capabilities of massive multi-core CPUs (a single rack can hold up to 800 CPU cores) of central importance to all future software development.
    As we saw above, in-memory columnar storage like SAP HANA places all the data from a given column together in memory, making it easy to assign one or more cores to process a single column. This is called vertical fragmentation. Tables can also be split into sets of rows and distributed to different processors, in a process called horizontal fragmentation.
    This is particularly important as data volumes continue to grow, and it has been used with some success to achieve parallelism in data warehousing applications. Both methods can be applied not only across multiple cores in a single machine, but also across multiple machines in a cluster or in a data center.
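
    A minimal sketch of horizontal fragmentation, assuming plain Python and only the standard library: the column is split into row ranges, the partial sums are computed in parallel worker processes, and the partial results are combined. The same pattern extends from the cores of one machine to the servers of a cluster.

        # Horizontal fragmentation: split a column into row ranges and aggregate
        # the fragments in parallel, then combine the partial results.
        from concurrent.futures import ProcessPoolExecutor

        def partial_sum(fragment):
            # Each worker aggregates one horizontal fragment of the column.
            return sum(fragment)

        def parallel_sum(column, workers=4):
            size = max(1, len(column) // workers)
            fragments = [column[i:i + size] for i in range(0, len(column), size)]
            with ProcessPoolExecutor(max_workers=workers) as pool:
                return sum(pool.map(partial_sum, fragments))

        if __name__ == "__main__":
            amounts = [float(i % 100) for i in range(1_000_000)]
            print(parallel_sum(amounts))  # same result as sum(amounts), computed in parallel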

    Compression for Performance and to Save Space in Main Memory
    Data compression techniques exploit redundancy within data and knowledge about the data domain. Compression applies particularly well to columnar storage in an enterprise data management scenario, since all data within a column has the same data type and in many cases there are few distinct values, for example in columns like country, town, name, or status. In column stores like SAP HANA, compression is used for two reasons: to save space and to increase performance. Efficient use of space is of particular importance to in-memory data management because, even though the cost of main memory has dropped considerably, it is still relatively expensive compared to disk. Due to the compression within the columns, the density of information in relation to the space consumed is increased. As a result, more relevant information can be loaded for processing at a time, thereby increasing performance. Fewer load actions are necessary in comparison to row storage, where even columns of no relevance to the query are loaded without being used.
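
    The most basic scheme behind this is dictionary encoding. A minimal sketch, assuming a low-cardinality column of country names: each distinct value is stored once in a dictionary, and the column itself holds only small integer codes on which scans and group-bys can operate directly.

        # Dictionary encoding of a low-cardinality column.
        country_column = ["Germany", "USA", "Germany", "France", "USA", "Germany"]

        # Store each distinct value once and replace occurrences with integer codes.
        dictionary = sorted(set(country_column))                 # ['France', 'Germany', 'USA']
        code_of = {value: code for code, value in enumerate(dictionary)}
        encoded = [code_of[value] for value in country_column]   # [1, 2, 1, 0, 2, 1]

        # Scans can run directly on the compact integer codes instead of the strings.
        germany_orders = sum(1 for code in encoded if code == code_of["Germany"])
        assert germany_orders == 3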

    In-Memory/HANA – Rethinking Application Development to Achieve a Real “Real-Time Enterprise” in the Near Future
    In-memory/HANA is not only a technology but also a different way of thinking about software development: we must take fundamental hardware factors into account, such as access times to main memory versus disk and the potential parallelism that can be achieved with multi-core CPUs. Taking this new world of hardware into account, we must write software that explicitly makes the best possible use of it. On the positive side for developers of new real-time enterprise applications, this lays the technological foundations for a database layer tailored specifically to all these issues. On the negative side, however, the database will not take care of all the issues on its own. Developers must understand the underlying layers of software and hardware to best take advantage of the potential for a real-time enterprise.
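
    One way to picture this shift in development style is the difference between pulling raw rows into the application and pushing the computation down to the in-memory, columnar data layer. The sketch below is illustrative only; fetch_all_rows and execute_sql are hypothetical placeholders for whatever data access layer an application actually uses, not an SAP HANA API.

        # Application-side aggregation: every row crosses the database/application
        # boundary before anything is computed.
        def revenue_per_country_app_side(fetch_all_rows):
            totals = {}
            for row in fetch_all_rows():  # transfers the full table
                totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
            return totals

        # Pushed-down aggregation: only the small result set leaves the data layer,
        # where columnar scans and parallel cores do the heavy lifting.
        def revenue_per_country_pushed_down(execute_sql):
            return execute_sql("SELECT country, SUM(amount) FROM sales GROUP BY country")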
