Artificial Intelligence (AI) and the Internet of Things (IoT) are converging into what we define as AIoT. With edge computing, computational power moves to the edge, where IoT devices gather data. Running AI at the edge is the next logical step: it enables efficient data handling, lowers latency, and opens the door to innovative solutions at the edge.
The conditions under which this AI computation takes place at the edge vary widely, and every device must account for them. This paper examines these trends in light of the need for optimized storage and memory solutions for AI edge applications.