3 Ways Blockchain and DLT Can Help Artificial Intelligence
Artificial intelligence, which broadly means the development and application of machines that can "learn" from data and experience, adjust to new inputs, and perform human-like tasks, has a wide range of applications.
For instance, AI-based algorithms now power high-tech customer support, where AI-enabled "customer assistants" can answer simple customer questions. They are also used to personalize the shopping experience, for instance by making purchase recommendations based on a customer's previous purchases and search terms. Artificial intelligence drives search engine algorithms, email filtering, movie platforms, and social media feeds and suggestions, and it is used in healthcare to aid the analysis of samples and to improve workflows.
AI is also employed in travel and navigation, through algorithms that suggest routes and help travelers and tourists with travel plans, flight suggestions, trips, and bookings. AI-based algorithms fed with information such as newspaper headlines, conversations, and speeches can help artists create new themes. In security and surveillance, AI-based algorithms in camera systems, drones, and other machines are fed with, or made to collect, data from multiple feeds to identify possible risks based on that incoming information.
In finance, AI-based algorithms have vast applications: they enable the collection and analysis of large volumes of quantitative data used to make financial decisions, for instance by providing analytics to investors inside financial investment platforms; they support real-time reporting of financial information; and they improve the accuracy and speed of processing large volumes of financial data. The financial sector is well known for building machine learning, algorithmic trading, adaptive intelligence, chatbots, automated financial advisors, process automation, the generation of actionable reports from financial data, and more into an array of processes. Together, these have cut costs by saving countless hours of analyst work and improving decision-making.
Autonomous driving, through smart cars and drones and the systems that manufacture and monitor them, is being widely tested, and AI is a crucial component of it. Smart homes are, and will remain, largely dependent on AI-enabled algorithms; the devices running these algorithms "learn" from the user's behavior and personalize or suggest options based on it.
Of course, artificial intelligence is far from perfect. AI-based machines cannot learn on their own: they need pre-fed data, and their programs act in a given way when a pre-set condition is identified in incoming data compared against previously trained models. They excel mainly at easing and repeating routine tasks, because convenience, speed, accuracy, and assurance can be optimized only against previously known models; hence the term "machine learning." AI has other weaknesses as well. Progress has been limited in many of the areas where AI could help make the world a better place. One of the greatest impediments to AI application is that it requires a vast amount of computing resources, and alongside this, storage and data transfer costs can be prohibitive with centralized systems. We are only at the starting point, and already challenges such as storage costs, limited computational power, data and information security problems, and hacking are impeding AI applications.
What could limit AI application? The challenges
AI's computing demands will be made worse by a number of challenges. The first is the cost of storing high volumes of training data and the bandwidth cost of transferring it. The desire to store data close to the processors, to achieve speedy exascale processing in centralized supercomputers, will require a redesign of computer memory and of the underlying I/O (input/output) operations, along with significant investments in network capacity. Other costs include operating (OPEX) and environmental costs, particularly power consumption and CO2 emissions.
Secondly, physical limitations are likely to impede the manufacturing of computing hardware: the ability to keep shrinking microprocessors in line with Moore's law will soon reach its limit.
Increasing data regulation will also continue to make centralized data placement more difficult and undesirable, challenging the very purpose of building supercomputers. A good example is the GDPR, recently adopted in Europe.
Another challenge is the lack of a computing-supportive ecosystem: the launch of new computing hardware is on the decline, and the rise of high-level programming languages, APIs, and libraries is contributing to a less nuanced understanding of computing architectures and basic computing operations.
Using blockchain to realize AI's full potential
Increasing computing power
Machine learning and deep learning (an advanced type of machine learning that involves multilayered neural networks) are both data- and computing-heavy. For instance, a basic deep neural network classifying animal pictures into dogs and cats will need hundreds of thousands of classified animal pictures as training data and billions of iterative computations in order to mimic a four-year-old's ability to discern cats from dogs.
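To make the scale concrete, here is a minimal, purely illustrative sketch: a tiny logistic-regression "classifier" trained on an invented two-feature dataset standing in for image pixels. Even this toy model touches every example on every training pass, which hints at why real image classifiers need so many iterative computations. All names and data here are hypothetical.

```python
import math
import random

random.seed(0)

# Synthetic training set standing in for "cat vs. dog" features:
# class 1 clusters around (2, 2), class 0 around (-2, -2).
data = [([random.gauss(2, 1), random.gauss(2, 1)], 1) for _ in range(200)] + \
       [([random.gauss(-2, 1), random.gauss(-2, 1)], 0) for _ in range(200)]

w, b, lr = [0.0, 0.0], 0.0, 0.1

def predict(x):
    # Logistic activation: probability that x belongs to class 1.
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1 / (1 + math.exp(-z))

# Each epoch visits every example, so compute scales with data size --
# the reason real image models need billions of operations.
for _ in range(20):
    for x, y in data:
        err = predict(x) - y          # gradient of the log-loss
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

accuracy = sum((predict(x) > 0.5) == bool(y) for x, y in data) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

Even this two-weight toy performs 8,000 parameter updates; a deep network with millions of weights multiplies that cost accordingly.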
Scalable advances in the computational power available for machine learning will need to accelerate if the vast amount of hardware and computational power required is to match the growth of AI applications. Most AI applications currently rely on traditional CPU-based computing in data centers to perform machine learning tasks. Since a typical CPU has between 6 and 14 cores and can run between 12 and 28 command threads, typically on a single block, a vast number of these CPU data centers will be needed to meet AI's growing demand. GPUs are also being used: these can hold between 2,000 and 3,000 cores and run 100 or more command threads, each thread typically across around 30 blocks at the same time, increasing speed while consuming less energy.
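The parallelism gap described above can be put into rough numbers. A back-of-envelope sketch using the core and thread counts quoted in the text (illustrative only; real throughput also depends on clock speed, memory bandwidth, and workload):

```python
# Figures taken from the ranges quoted above (high end of each range).
cpu_cores, cpu_threads_per_core = 14, 2   # typical CPU: 14 cores, 28 threads
gpu_cores = 3000                          # modern GPU: thousands of simple cores

cpu_concurrent_threads = cpu_cores * cpu_threads_per_core
gpu_concurrent_threads = gpu_cores        # simplified: one thread per core

print(f"CPU concurrent threads: {cpu_concurrent_threads}")
print(f"GPU concurrent threads: {gpu_concurrent_threads}")
print(f"rough GPU advantage: ~{gpu_concurrent_threads // cpu_concurrent_threads}x")
```

This is only a thread-count comparison, not a benchmark, but it illustrates why GPU capacity is the resource worth pooling.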
Blockchain or DLT can provide the computational resources AI needs by tapping the idle GPU power of machines on the network. In the Bitcoin protocol, for instance, many miners together solve complex mathematical problems that a single miner cannot. This can be achieved by tokenizing computing power, in the same way the value of property and services is being tokenized. On some blockchain networks, it is already possible for participants to share resources such as storage and computational power to perform different tasks.
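As a hedged illustration of what tokenized compute sharing might look like, the sketch below models a hypothetical marketplace ledger in which a provider lists idle GPU-hours and a buyer pays for them in tokens. The class and account names are invented and do not correspond to any real network's protocol:

```python
from dataclasses import dataclass, field

@dataclass
class ComputeMarket:
    """Toy ledger: token balances plus offered idle GPU-hours per provider."""
    balances: dict = field(default_factory=dict)  # account -> token balance
    offers: dict = field(default_factory=dict)    # provider -> idle GPU-hours

    def list_capacity(self, provider, gpu_hours):
        # A provider advertises spare GPU capacity on the network.
        self.offers[provider] = self.offers.get(provider, 0) + gpu_hours

    def buy_compute(self, buyer, provider, gpu_hours, price_per_hour):
        # Atomically swap tokens for compute; fail if either side is short.
        cost = gpu_hours * price_per_hour
        if self.offers.get(provider, 0) < gpu_hours:
            return False
        if self.balances.get(buyer, 0) < cost:
            return False
        self.offers[provider] -= gpu_hours
        self.balances[buyer] -= cost
        self.balances[provider] = self.balances.get(provider, 0) + cost
        return True

market = ComputeMarket(balances={"ai_lab": 100.0})
market.list_capacity("idle_gpu_owner", 50)            # 50 spare GPU-hours
ok = market.buy_compute("ai_lab", "idle_gpu_owner", 10, 0.05)
print(ok, market.balances["idle_gpu_owner"], market.offers["idle_gpu_owner"])
```

On a real network, the ledger updates would be consensus-validated transactions rather than in-memory dictionary writes, but the accounting idea is the same.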
Decreasing computing costs
The demand for AI computation is doubling every 3.5 months, with costs rising proportionately. Price is also being used as a lever to control usage, which restricts innovation. The cost of GPU computing time on major cloud platforms is around $0.5/hour. With a centralized system, costs can grow prohibitive because of the need to invest in huge centralized servers and technology as demand for AI grows.
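The compounding effect of that 3.5-month doubling period is easy to underestimate; a quick calculation shows roughly a thousand-fold growth in demand over three years:

```python
# Compound growth implied by "demand doubles every 3.5 months".
months = 36                 # a three-year horizon
doubling_period = 3.5       # months per doubling, as quoted above

growth = 2 ** (months / doubling_period)
print(f"~{growth:.0f}x compute demand after {months} months")
```

At that rate, any fixed pool of centralized hardware is outgrown within a few doubling periods, which is the economic pressure behind distributed alternatives.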
Today, distributed computing systems and marketplaces for GPU computing power are emerging: blockchains connect members in a network, and these members can contribute or share computational power to complete given tasks, including machine learning tasks. On these platforms, the cost of GPU computing is around $0.01-0.05/hour.
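Taking the hourly rates quoted above at face value, a rough comparison for a hypothetical 10,000 GPU-hour training job looks like this (a sketch, not a pricing quote; the job size is invented):

```python
cloud_rate = 0.50          # $/GPU-hour, cloud figure quoted above
marketplace_rate = 0.03    # $/GPU-hour, midpoint of the $0.01-0.05 range
gpu_hours = 10_000         # hypothetical training job

cloud_cost = cloud_rate * gpu_hours
marketplace_cost = marketplace_rate * gpu_hours

print(f"cloud: ${cloud_cost:,.0f}")
print(f"distributed marketplace: ${marketplace_cost:,.0f}")
print(f"roughly {cloud_rate / marketplace_rate:.0f}x cheaper")
```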
Improved data integrity
Blockchain makes data private, immutable, transparent, and distributed, and it operates without the direction of a sovereign entity. Public blockchains may soon become important feeds to AI, not only as a way of sharing computational power but also as a way of preserving the validity of models. They can add structure and accountability to AI algorithms and to the quality and use of the intelligence they produce. Blockchain can help prevent the alteration of AI data, and a distributed ledger can ensure the same data is shared across different platforms.
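A minimal sketch shows why a hash-linked ledger makes tampering with AI data detectable: each block commits to the hash of the previous one, so altering any record invalidates the rest of the chain. The record names are hypothetical:

```python
import hashlib

def block_hash(prev_hash, record):
    # Each block's hash covers both its record and the previous block's hash.
    return hashlib.sha256((prev_hash + record).encode()).hexdigest()

records = ["training-set-v1", "model-weights-v1", "eval-results-v1"]

# Build the chain from a fixed genesis hash.
chain, prev = [], "0" * 64
for record in records:
    h = block_hash(prev, record)
    chain.append((record, h))
    prev = h

def verify(chain):
    # Recompute every hash; any edited record breaks the link.
    prev = "0" * 64
    for record, h in chain:
        if block_hash(prev, record) != h:
            return False
        prev = h
    return True

print(verify(chain))   # the intact chain verifies

# Tamper with one record without recomputing its hash: detection is immediate.
chain[1] = ("model-weights-TAMPERED", chain[1][1])
print(verify(chain))
```

An AI system consuming models or training data from such a ledger can run this kind of verification before trusting the data, which is the "data integrity" benefit described above.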