Webcast: This is how real time becomes reality

by Tejas Dhawan
Being able to use data in real time is becoming a competitive advantage.
Photo: Peshkova – shutterstock.com

Delivering on the promise of real-time data requires the right technology. A Computerwoche webcast shows what SAP HANA, SAS Viya and AI can do on IBM POWER. Scale up or scale out – both are possible.

Dilek Sezgün, AI, Big Data and Open Source Ecosystem and Infrastructure Leader at IBM, speaks about this with Dennis Eichkorn, Head of Alliances & Channels DACH at SAS Institute AG; Ulrich Oymann, Business Development Manager, SAS on Power Systems at IBM; and Andreas Span, Director & Business Unit Executive, SAP HANA on Power & Cognitive Sales at IBM. IT analyst Axel Oppermann from Avispador contributes a look at the market.

Specialist journalist Oliver Janzen from Computerwoche moderates the webcast and first gives the analyst the floor. “The real-time issue is going through the roof,” observes Oppermann. The “huge enthusiasm” around the catchphrase of data as the new oil shows in the transformation processes that many companies are currently going through. “And this requires data to be available in real time,” emphasizes the analyst.

Nine out of ten decision-makers agree with this statement – yet many are held back by the myth that real-time data entails high costs and effort. Oppermann advises a systematic approach, starting with questions such as: Where do I want to go? How do I create agility? From there, it is a matter of defining the right data and its sources as well as the technological basis and platforms. In short, the topic can be reduced to a formula: AI (Artificial Intelligence) goes hand in hand with IA (Information Architecture). In practice, however, many companies lack the internal know-how for this, Oppermann adds.

Before continuing with the expert round, the moderator wants to know where the webcast viewers stand: Are they already using HANA, and if so, on which platform? 22 percent confirm they run HANA on Power, another 28 percent plan to do so, and six percent use HANA on x86.

Whatever the usage scenario, Span (IBM) makes one thing clear: “We are currently facing unprecedented problems. To be able to solve them, we have to access data in real time.” The idea of the Single Source of Truth harbors a danger: it can also become a “Single Source of Failure”. Hence the need for a stable infrastructure. For Span, that includes the following points: the platform must enable fast provisioning, the system must be secure, and it must maximize availability in order to provide faster access to data.

If HANA, then on the IBM platform – Span argues this with resilience, availability and seamless scaling. “The cost structure can be adapted to the customer,” he emphasizes. IBM currently has 75 global references from retail, manufacturing, automotive and other industries. Cloud or on prem? Both are possible, as are hybrid models. The acquisition of Red Hat plays a role here.

Oymann’s expert topic is SAS on Power. “The question is not so much whether SAS is installed, but which modules,” he says. The partnership between SAS and IBM goes back 40 years. Oymann sees the traditional SAS stack (SAS 9.4 and SAS Grid) as the backbone. The next-generation solution SAS Viya, now on the market, adds an in-memory analytics framework that covers ML/DL and AI workloads; it is open and cloud-enabled. Oymann cites faster time to insight, a full-stack solution and maximized reliability as the advantages. If needed, SAS can provide decision-makers with a new deployment guide.
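The “open” part refers, among other things, to the fact that SAS Viya’s in-memory engine, Cloud Analytic Services (CAS), can be addressed from languages such as Python via SAS’s open-source swat package. A minimal sketch of such a connection, assuming a reachable CAS server – host, port, credentials and file name here are placeholders, not details from the webcast:

    import swat

    # Connect to the CAS in-memory engine of a SAS Viya installation.
    conn = swat.CAS("cas-server.example.com", 5570, "user", "password")

    # Upload a CSV into the distributed in-memory store ...
    tbl = conn.read_csv("sensor_data.csv", casout="sensors")

    # ... and run a descriptive summary on the server, not on the client.
    print(tbl.summary())

    conn.close()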

Eichkorn (SAS Institute) gives a highly topical reference: the Robert Koch Institute (RKI) is now using real-time data in the Covid-19 crisis. Where are intensive care beds occupied, where are ventilators available – these are the questions. “SAS Viya is specifically designed to address such new customer challenges,” he says.

That brings the discussion to artificial intelligence (AI). Before moderator Janzen turns to Sezgün (IBM), he asks the webcast viewers where they see the greatest challenges here. The result is clear: 41 percent cite developing the models. In addition, 24 percent each name training the systems and the availability of sufficient data as difficulties.

“Our customers often don’t know how to start their journey to AI,” confirms Sezgün. IBM outlines the right approach with a five-step model: the basis is a modern infrastructure for cutting-edge data & AI workloads. Building on this (from bottom to top) are the steps collect the data, organize, analyze and infuse. Here too, all of this is possible on prem or in the cloud. If the latter is desired, the expert recommends “looking carefully at what data is allowed in the cloud”.
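What the four stages above the infrastructure layer mean in practice can be illustrated with a small, purely hypothetical Python pipeline – the function names and the transactions.csv file are our own illustration, not IBM terminology or APIs:

    import pandas as pd

    def collect() -> pd.DataFrame:
        # Collect: pull the raw data from its source (here a CSV stand-in).
        return pd.read_csv("transactions.csv")

    def organize(df: pd.DataFrame) -> pd.DataFrame:
        # Organize: clean and deduplicate so the data becomes trustworthy.
        return df.dropna().drop_duplicates()

    def analyze(df: pd.DataFrame) -> pd.Series:
        # Analyze: derive an insight, e.g. revenue per customer segment.
        return df.groupby("segment")["revenue"].sum()

    def infuse(insight: pd.Series) -> None:
        # Infuse: feed the result back into the business process.
        print(insight.sort_values(ascending=False))

    infuse(analyze(organize(collect())))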

In terms of infrastructure requirements for AI, IBM differentiates between the two areas of training and inference. “You need scalability in both areas,” says Sezgün. IBM products work best in combination with Spectrum Scale storage. “It is extremely important for a data scientist to work quickly,” she emphasizes. The right solution can cut the time it takes to train the models from eight hours to three or four. The IBM AI portfolio – “everything you need for Enterprise AI, on Infrastructure and on any Cloud” – includes several Watson products, for example.

When it comes to using artificial intelligence, the webcast viewers have big plans: 65 percent want to set up and train the systems themselves, as a third survey by moderator Janzen shows. However the companies implement it, one thing is clear: “Infrastructure matters”.

Watch the webcast here
