IBM AI Infrastructure on the Surface
Scrounging for material on artificial intelligence from the company with the most machine learning patents in the world
IBM holds more machine learning patents than any other company in the world, and it has been considered a leader in patents generally for quite some time. That alone should perhaps garner some attention in terms of how IBM structures its approach to artificial intelligence.
The IBM AI Infrastructure offering is marketed around three components:
- Spectrum Storage
- Spectrum Computing
- Enterprise AI servers
That is a lot of talk about Spectrum, but what is it, according to IBM?
“IBM Spectrum Computing uses intelligent workload and policy-driven resource management to optimize computing clusters across the data center, on premises and in the cloud. Increase ROI and reduce total cost of ownership with a complete system and enterprise support from IBM.”
I looked into a product from IBM called IBM Spectrum Scale: “…Advanced storage management of unstructured data for cloud, big data, analytics, objects and more.”
In terms of storage, IBM has a clear goal.
“The two most powerful computers on the planet — Summit and Sierra — are purpose built for AI, and both depend on IBM Storage. Spectrum Storage for AI featuring IBM Spectrum Scale provides the solid foundation and shared data service on which to build efficient, scalable data pipelines for AI and big data. Now it’s easier than ever to deploy Spectrum Scale for AI and big data workloads and machine and deep learning.”
A report from IBM describes moving data from ‘ingest to insight’.
In a presentation from March 2019, this is what that pipeline looks like:
It looks complicated; there is a lot in one picture. I cannot vouch for its accuracy today, one year on, or in all cases.
However, it paints an interesting picture.
In this material, IBM includes a “housekeeping checklist” for data. It provides a neat overview of practical tips and tricks for working with data.
The checklist offers tips on how to:
- “Gather existing data already in your organization
- Develop algorithm models to analyze your data sets
- Apply those models to new and incoming data”
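As a loose illustration (not IBM's code), the three checklist steps can be sketched as a minimal Python pipeline: gather existing data, fit a simple model, then apply it to new and incoming data. The data and the nearest-centroid "model" here are invented for the example.

```python
from statistics import mean

# Step 1: gather existing data already in your organization
# (invented example: labelled (feature, label) pairs)
existing_data = [(1.0, "low"), (1.5, "low"), (8.0, "high"), (9.5, "high")]

# Step 2: develop a model to analyze your data sets
# (a toy nearest-centroid "model": the mean feature value per label)
def train(samples):
    centroids = {}
    for label in {lbl for _, lbl in samples}:
        centroids[label] = mean(x for x, lbl in samples if lbl == label)
    return centroids

# Step 3: apply the model to new and incoming data
def predict(centroids, x):
    return min(centroids, key=lambda label: abs(x - centroids[label]))

model = train(existing_data)
print(predict(model, 2.0))   # closest to the "low" centroid
print(predict(model, 7.0))   # closest to the "high" centroid
```

Real pipelines use far richer models, but the shape of the workflow (existing data in, model out, predictions on fresh data) is the same.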
In a different reference architecture, they present a rather more easily understood picture.
They have outlined an AI workflow that makes sense.
IBM also describes how Spectrum Computing uses ‘policy-driven resource management’ to help:
IBM Spectrum Computing accelerates and simplifies AI, data analytics, and HPC.
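IBM does not publish Spectrum Computing's internals in this material, but the general idea of 'policy-driven resource management' can be illustrated with a toy sketch: jobs carry attributes, a policy ranks them, and a scheduler hands out cluster slots in that order. All job names and the policy itself are invented for illustration.

```python
# Toy illustration of policy-driven scheduling (not IBM's implementation).
# A "policy" is just a ranking function over waiting jobs; the scheduler
# assigns a fixed number of cluster slots in policy order.

jobs = [
    {"name": "train-resnet",  "priority": 3, "gpus": 2},
    {"name": "etl-nightly",   "priority": 1, "gpus": 0},
    {"name": "inference-api", "priority": 5, "gpus": 1},
]

def priority_policy(job):
    # Higher priority first; break ties by smaller GPU footprint.
    return (-job["priority"], job["gpus"])

def schedule(jobs, slots):
    """Assign up to `slots` jobs in the order the policy dictates."""
    ranked = sorted(jobs, key=priority_policy)
    return [job["name"] for job in ranked[:slots]]

print(schedule(jobs, 2))  # the two highest-priority jobs run first
```

Swapping in a different policy function (fair share, deadline-driven, cost-aware) changes the cluster's behaviour without touching the scheduler itself, which is the appeal of the policy-driven approach.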
IBM truly has a lot of written material on processes relating to artificial intelligence. Its three-step cycle (data, train, inference) may make the approach easier to grasp for people less familiar with machine learning projects.
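The circle IBM draws (data, train, inference) can be read as a loop: collect data, train a model, run inference on incoming observations, and feed those observations back into the data stage. A minimal sketch of that cycle, with an invented "model" that is just a running mean:

```python
# Minimal sketch of the data -> train -> inference cycle (illustrative only).

data = [2.0, 4.0]              # data stage: what has been gathered so far

def train(samples):
    # "Training" here is simply computing the mean of the samples.
    return sum(samples) / len(samples)

def infer(model, x):
    # "Inference": flag values far from what the model has seen.
    return "anomaly" if abs(x - model) > 3.0 else "normal"

for incoming in [3.5, 10.0, 4.5]:
    model = train(data)             # train stage
    label = infer(model, incoming)  # inference stage
    data.append(incoming)           # new data feeds the next cycle
    print(incoming, label)
```

The point of the circle is that inference is not the end of the process: each pass through it grows the data that the next round of training uses.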
IBM is definitely one to follow when it comes to AI, and although there is a fair amount of marketing jargon in its material, it can be worth a read for clarifying data processes, or as a model for creating decent workflows that are easy to understand.
This is #500daysofAI and you are reading article 334. I am writing one new article about or related to artificial intelligence every day for 500 days. My focus for day 300–400 is about AI, hardware and the climate crisis.