Artificial Intelligence, Hardware and the Climate Crisis
Focusing on the climate crisis during the massive spread of the Coronavirus is a counterpoint: harmonically interdependent, yet independent in rhythm and contour. The decrease in travel resulting from the lockdowns has, in the short term, reduced global emissions, since emissions from transportation, especially flights, have dropped drastically. At the same time, many are realising that they can keep social relations or business going online, at least to some degree, keeping both the conversation and trade alive.
Yes, there has been a drop in sales for many companies, such as traditional retailers unwilling to adapt and use digital technology. Many have had to shut down as well: capacities to go digital vary, and there is a digital divide in the world that must be recognised.
Still, AI hardware is being made and distributed.
Sciforce has written a post about this called AI Hardware and the Battle for More Computational Power. In this article they describe a series of developments that enable machine learning solutions, and applications within the field of artificial intelligence more generally, to work more efficiently.
They name several aspects.
AI accelerators — a class of microprocessors, or microchips, designed to enable faster processing of AI applications, especially in machine learning, neural networks and computer vision.
- “The idea behind AI accelerators is that a large part of AI tasks can be massively parallel. With a general purpose GPU (GPGPU), for example, a graphics card can be used in massively parallel computing implementations, where they deliver up to 10 times the performance of CPUs.
- The second pillar of AI accelerators design is focused on multicore implementation. Think of a GPU that can accelerate such tasks using many simple cores that are normally used to deliver pixels to a screen.”
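To make the quoted idea concrete, here is a minimal sketch of why AI workloads parallelise so well: each output neuron of a dense layer is an independent dot product over a shared input, so thousands of simple cores can each compute one. The sizes and values below are my own illustrative assumptions, not from the article.

```python
# Each output of a dense layer depends only on the shared input vector
# and that neuron's own weights, so all outputs can be computed
# independently -- the property AI accelerators exploit with many
# simple cores. (Illustrative sketch; values are made up.)

from concurrent.futures import ThreadPoolExecutor

def neuron_output(weights, inputs):
    """One output neuron: a single dot product."""
    return sum(w * x for w, x in zip(weights, inputs))

inputs = [0.5, -1.0, 2.0]                 # shared input vector
weight_rows = [[1.0, 0.0, 0.0],           # one weight row per neuron
               [0.0, 1.0, 0.0],
               [1.0, 1.0, 1.0]]

# Sequential version (what a single core would do):
sequential = [neuron_output(w, inputs) for w in weight_rows]

# Parallel version: every dot product is an independent task.
with ThreadPoolExecutor() as pool:
    parallel = list(pool.map(lambda w: neuron_output(w, inputs),
                             weight_rows))

print(sequential)              # [0.5, -1.0, 1.5]
print(parallel == sequential)  # True
```

A GPU does the same thing in hardware, with thousands of such tasks in flight at once instead of a small thread pool.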
However, there may be more down-to-earth applications of these combinations. Consider some of the most successful applications (questionable at times, of course), such as a camera.
This fluent and clever integration of software and hardware may be what defines these kinds of products within the field of artificial intelligence. Gathering better data does not mean trying your best to patch up what already does not work; rather, the opportunity lies in gathering better data with solutions optimised for this purpose.
Here is the features section from the spec sheet of the Huddly IQ.
Hardware is not immediately what comes to mind when we talk about the digital, yet it is ultimately what makes the digital work. The computers, wires, satellites, phones, cameras, servers and so on are where solutions within the field of artificial intelligence will be running. Code needs to be understood in the context where it runs, generally on some sort of physical device placed in a given setting.
In any application we have to consider the situated use of hardware, and where the commodities included or integrated in the given solution come from.
There are headlines like:
Indeed, excitement can be found in this. Why is it significant?
“On-device computing solutions startup Perceive emerged from stealth today with its first product: the Ergo edge processor for AI inference. CEO Steve Teig claims the chip, which is designed for consumer devices like security cameras, connected appliances, and mobile phones, delivers “breakthrough” accuracy and performance in its class.”
Together with Arlo Technologies, a company focused on home automation, they intend to protect privacy.
“By eliminating the need to send data to the cloud for analysis, Ergo could bolster battery life while providing peace of mind to homeowners and businesses about privacy.”
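As a hypothetical sketch of the pattern the quote describes (the function names and the 0.9 threshold are my own invention, not Perceive's API): inference runs on the device, and only a compact event record, never the raw frame, is ever transmitted.

```python
# Hypothetical sketch of on-device ("edge") inference: the model runs
# locally and only small event records are transmitted, so raw video
# never leaves the camera. All names and the 0.9 threshold are
# illustrative assumptions, not a real vendor API.

def handle_frame(frame, detect, send_alert):
    """Run detection locally; transmit only a compact summary."""
    label, confidence = detect(frame)          # on-device inference
    if label == "person" and confidence > 0.9:
        send_alert({"event": label, "confidence": confidence})
    # the raw frame is discarded here, never uploaded

# Toy usage with stand-ins for the model and the network:
alerts = []
fake_detect = lambda frame: ("person", 0.95)
handle_frame(frame=b"...raw pixels...", detect=fake_detect,
             send_alert=alerts.append)
print(alerts)  # [{'event': 'person', 'confidence': 0.95}]
```

The privacy and battery arguments both follow from the same design choice: a few bytes of metadata cross the network instead of a continuous video stream.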
It is important because these companies are focused on technology that could become integrated into your everyday life. Yet they are companies that are not talked about to any great degree.
The article in VentureBeat mentions several other companies.
“Startups Hailo, AIStorm, Esperanto Technologies, Quadric, Graphcore, Xnor, and Flex Logix are developing chips customized for AI workloads — and they’re far from the only ones. The machine learning chip segment was valued at $6.6 billion in 2018, according to Allied Market Research, and it is projected to reach $91.1 billion by 2025.”
I had a look at the companies mentioned and gathered some basic background information.
- Hailo is developing a breakthrough specialised deep learning processor that empowers intelligent devices with the performance of a data-center-class computer, operating in real time at minimal power consumption, size and cost.
- AIStorm makes sensors that convert sensor information into a digital representation, an approach it calls AIS, or AI-in-Sensor. “Team of experienced semiconductor executives, backed by leading sensor and equipment manufacturers, aims to equip the next generation of handsets, IoT devices, wearables, and vehicles with a new approach to AI processing at the edge; compared with edge GPU solutions, AIStorm’s technology boosts performance while lowering power requirements and system cost”, as described on the 11th of February in BusinessWire.
- Esperanto Technologies is at the leading edge of technology putting thousands of RISC-V cores on a single chip. Esperanto delivers high-performance, energy-efficient computing solutions, the compelling choice for the most demanding AI / ML / DL applications.
- Quadric is building the only end-to-end architecture optimised for real-time edge computing.
- Graphcore has built a new type of processor for machine intelligence to accelerate machine learning and AI applications for a world of intelligent machines.
- Xnor.ai is an edge computing company that has made a device capable of deploying a state-of-the-art deep learning algorithm with a small solar panel, without requiring a battery or other power source.
- Flex Logix Technologies builds on patented interconnect technology from its embedded field-programmable gate array (eFPGA) chip designs, combining the eFPGA tech with inference-optimised nnMAX clusters in a new combination chip.
While briefly examining these companies I found a whitepaper by Quadric with a few images that may explain part of the thinking around this technology, dealing with perception and prediction.
In more detail it is visualised as such:
The VentureBeat article, “Perceive emerges from stealth with Ergo edge AI chip”, refers to the market size:
The machine learning chip segment was valued at $6.6 billion in 2018, according to Allied Market Research, and it is projected to reach $91.1 billion by 2025.
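For perspective, those two figures imply a compound annual growth rate of roughly 45 percent over the seven years from 2018 to 2025. This is my own back-of-the-envelope calculation, not a number from the article:

```python
# Back-of-the-envelope compound annual growth rate (CAGR) implied by
# the quoted figures: $6.6B in 2018 growing to $91.1B by 2025.
start, end, years = 6.6, 91.1, 7

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # Implied CAGR: 45.5%
```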
In comparison, the cloud market is projected to grow far beyond that:
The global cloud computing market is expected to reach $623.3 billion by 2023.
Every cloud solution is, however, made up of many small components that have to be bought and delivered.
Much of this digital infrastructure has been strained during the spread of the Coronavirus, and all hardware has an effect on the environment in terms of emissions, both from operations and from materials.
Regardless, it will be interesting to consider these topics in relation to each other.
This is #500daysofAI and you are reading article 303. I am writing one new article about or related to artificial intelligence every day for 500 days.