At this year’s Apsara Conference, Alibaba revealed the numbers behind its AI prowess along with a host of new AI products and solutions.
For the first time, Alibaba disclosed adoption and processing figures for its AI Platform. These figures offer a glimpse of the platform’s true power:
- Apsara AI Platform handles more than one trillion calls per day.
- Apsara AI Platform serves 1 billion global customers.
- Apsara AI Platform processes 1 billion images, 1.2 million hours of video, 550,000 hours of voice messages, and 500 billion natural-language sentences per day.
Also at the conference, Alibaba, and more specifically Alibaba Cloud, showed how it leads in everything AI related: from AI chips to a whole host of applications, and from AI-empowered cloud services and platforms to algorithms and industry solutions.
An avid fan of and contributor to Alibaba Cloud created the illustration below of Alibaba Cloud’s AI architecture. It makes for a good summary of Alibaba Cloud’s AI offerings and where each sits in the overall architecture:
To get to where it is now, Alibaba Cloud had to make great strides in chip design and machine learning algorithms, along with its AI-powered service and platform offerings.
Alibaba Cloud’s semiconductor subsidiary Pingtouge launched the AI inference chip Hanguang 800 in September of this year. As revealed at the conference, the chip is so powerful that it has the equivalent computing capability of 10 GPUs, ranking first in the world among chips in performance and energy efficiency.
Alibaba Cloud ranks as one of the top cloud service providers in the world and the number one provider in the Asia-Pacific region. As a firm believer in and prominent leader of the public cloud market, Alibaba Cloud offers the most diverse and largest-scale AI clusters of any cloud service provider in Asia, with a vast selection of processing platforms: CPUs, GPUs, FPGAs, NPUs, supercomputing clusters, and, of course, the new third generation of the X-Dragon Architecture. Naturally, Alibaba Cloud’s services are well integrated with one another, working together to provide unparalleled support for industrial AI applications.
When it comes to AI in particular, Alibaba Cloud’s platforms are the Apsara AI Platform, the Apsara Big Data Platform, and the AIoT Platform. These platforms significantly lower the threshold for developers to work in the new and exciting field of AI. In addition, the Apsara AI Platform is the first commercial, cloud-based machine-learning platform in China to offer support for large-scale algorithms: it can process tens of billions of features and around 100 billion training samples.
In AI algorithm research, Alibaba Cloud has claimed more than 40 first-place results across a variety of topics, including natural language processing, speech recognition, and computer vision.
Alibaba Cloud’s AI offerings provide a host of services inside and outside of Alibaba Group.
Within Alibaba Group, the Ali Xiaomi (literally “Ali honeybee”) chatbot serves more than 5 million users each day. When it comes to AI-powered automation, Alibaba Cloud has invested heavily in smart speakers and autonomous driving. Alibaba’s AliGenie is currently the largest Chinese-language intelligent voice assistant in the world, and Alibaba is pushing its autonomous driving applications to evolve from individual AI-powered vehicles toward vehicle-road coordination.
Alibaba Cloud is a front-runner in applying AI to industrial scenarios. As early as 2015, it worked with ecosystem partners to actively promote the application of Internet technologies in traditional industries. Its AI technologies and applications, including the ET City Brain solution, have already seen massive adoption across industries such as transportation, public safety, and even medical services.
[Investing in AI the Smart Way — Lessons from 7 Tech Giants]
In this blog, we will discuss the “deep value” aspect of artificial intelligence, which every AI investor and enterprise should consider.
The advent of artificial intelligence (AI) triggered a research and development (R&D) boom in industries around the world. With its powerful computing capabilities and intelligent system functionality, AI has become increasingly critical to people’s everyday lives. In turn, this has led tech giants and venture capitalists to increase investment in artificial intelligence companies.
As AI becomes more popular around the world, CCID expects the global AI market to approach USD 39 billion in 2018 and USD 58 billion by 2020. In the eyes of many investors, artificial intelligence technology enables more informed decisions, allowing entrepreneurs and innovators to create products of even higher value for their customers.
However, judging by the history of the first two waves of artificial intelligence, as well as the decisions of global tech companies such as Alibaba and Apple, focusing on AI technology investment alone is not a wise decision. But why? Because the deeper value behind the technology is even more critical.
Data from CB Insights shows that total financing for artificial intelligence startups worldwide reached a record USD 15.2 billion in 2017, with Internet companies becoming more prominent AI investors. According to PitchBook data, the artificial intelligence and machine learning field received more than USD 10.8 billion in venture capital in 2017. These figures indicate that the market is optimistic about the future of artificial intelligence. Enterprises hope to be empowered by AI technology, which will inevitably make artificial intelligence a defining trend of our generation.
However, as far as the currently popular AI goes, we are already in its third wave of popularity. In March 2016, AlphaGo’s game of Go against Lee Sedol brought this technology to the forefront of the public consciousness, putting the words “artificial intelligence” on everybody’s lips.
As for the first two waves of popularity, the first happened in the 1950s, when Alan Turing proposed the famous Turing test. Milestone technologies and applications such as mathematical proof systems, knowledge reasoning systems, and expert systems set off the first artificial intelligence craze among researchers.
The second wave came about in the 1980s, when technology based on statistical models quietly emerged. Not only did it see the progression of speech recognition and machine translation technologies, but artificial neural networks also found application in areas such as pattern recognition. More importantly, in 1997, the Deep Blue computer system defeated the human chess master Garry Kasparov, which brought public enthusiasm to a peak.
It was during this period that AI first received significant hype and rapid increases in investment. Due to the situation at that time, investors did not adequately consider the value behind these entrepreneurial ideas but merely raised funds for what they believed to be an exciting technology. This led to the failure of most of the first generation AI startups, and they eventually faded away. For example, artificial intelligence companies founded in the 1980s such as Symbolics, IntelliCorp, and Gensym have either wholly transformed or no longer exist.
Now, 40 years later, we are facing a very similar problem. Even though the technology has become more sophisticated, the undeniable fact remains that AI has not yet created sufficient value for consumers. This is why Xiao Zhijun believes that investing in AI or “deep technology” alone is not a wise decision. On the contrary, Xiao Zhijun’s view is that the deeper value behind AI, rather than the technology itself, should be the target of investment.
Neural synapses in mammal brains are also capable of performing deep learning. New advances in AI and neuroscience aim to emulate, or even surpass, this capability.
Neural networks are inspired by the biological neurological system. But do our brains learn the same way a computer does deep learning? The answer to this question could, on the one hand, lead us to a more powerful deep learning model and, on the other, help us better understand human intelligence.
Do Biological Systems Use Deep Learning?
On December 5th, Blake A. Richards — a CIFAR researcher from the University of Toronto — and his colleagues published an article on eLife titled Towards deep learning with segregated dendrites. They described an algorithm with which to model deep learning in human brains. The network they built indicates that the neural synapses of certain mammals have the correct shapes and electric characteristics to make them suitable for deep learning.
Not only that, their experiments showed that the brain works in a way that is surprisingly close to deep learning’s natural biological analogue. This is promising for furthering our understanding of how we evolved our learning ability. If the connection between neurons and deep learning is confirmed, we can develop better brain/computer interfaces and will likely gain a number of new abilities, ranging from treating various kinds of disease to augmenting intelligence. The possibilities seem endless.
Evidence of Deep Learning on Neurons in a Mammalian Brain
This research was conducted by Richards and his graduate student Jordan Guerguiev, along with DeepMind’s Timothy Lillicrap. The neurons studied come from the cortex of a mouse brain. The cortex is responsible for high-order functions such as sensation, movement, spatial reasoning, consciousness, and language. Dendrites are the branch-like projections that grow out of neurons; under a microscope, they look somewhat like tree bark. Dendrites are the input channels of neurons, delivering electrical signals from other neurons to the cell body.
Using this neuronal structure as a guide, Richards and Guerguiev built a model called the “multi-compartment neural network model.” In this network, neurons receive signals in separate compartments. Because of this compartmentalized structure, different layers of simulated neurons can cooperate to achieve deep learning.
The results indicate that when recognizing hand-written numbers, a multi-layer network is significantly better than a single-layer network.
Algorithms that use multi-layer network structures to identify higher-order representations are at the core of deep learning. This suggests that mouse brain neurons may be able to do deep learning just like artificial neurons. “It’s just a set of simulations, so it doesn’t accurately reflect what the brain is doing, but if the brain can use the algorithms that AI is using, that’s enough to justify further experiments,” said Richards.
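As a toy illustration of why multi-layer networks outperform single-layer ones (this is not the authors’ model, just a minimal sketch with hand-chosen weights), consider XOR: no single linear unit can compute it, but a two-layer network can, because the hidden layer re-represents the inputs.

```python
# Minimal sketch: XOR is not linearly separable, so a single linear
# unit cannot compute it, but a two-layer network can.
# All weights are hand-chosen for clarity; nothing here is trained.

def step(x):
    """Hard-threshold activation."""
    return 1 if x > 0 else 0

def single_layer(x1, x2, w1, w2, b):
    # One linear unit: can only draw a single line through input space.
    return step(w1 * x1 + w2 * x2 + b)

def two_layer_xor(x1, x2):
    # Hidden layer: one unit detects "at least one input on",
    # the other detects "both inputs on".
    h1 = step(x1 + x2 - 0.5)   # fires for (0,1), (1,0), (1,1)
    h2 = step(x1 + x2 - 1.5)   # fires only for (1,1)
    # Output layer: "at least one on AND NOT both on" = XOR.
    return step(h1 - h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", two_layer_xor(a, b))
```

Stacking layers to build such higher-order representations is exactly the property that deep learning, and per the article possibly cortical dendrites, exploits.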
Using Better Training Methods for Deep Learning
In the early 2000s, Richards and Lillicrap took Hinton’s classes at the University of Toronto. They were convinced that the deep learning model to some extent accurately reflects the mechanics of the human brain. However, there were several challenges to validating this idea at the time. First, it was not yet certain whether deep learning could reach the level of complexity of the human brain. Second, deep learning algorithms typically violate biological facts that neuroscientists have already demonstrated.
The patterns of activity that occur in deep learning by computer networks are similar to the patterns seen in the human brain. However, some of the features of deep learning seem to be incompatible with the way the human brain works. Moreover, neurons in artificial networks are much simpler than biological neurons.
Now, Richards and other researchers are actively seeking ways to bridge the gap between neuroscience and artificial intelligence. This article builds upon the study by Yoshua Bengio’s team on how to train neural networks in a more biologically plausible way.
Alibaba Cloud’s heterogeneous platform for elastic computing aims to provide high-quality services that help organizations realize scientific and technological innovation.
Heterogeneous computing technology has evolved rapidly in the fields of big data and AI in recent years. To keep pace with this revolution, Alibaba Cloud Heterogeneous Computing has made great strides in both product variety and application.
Pan Yue, a senior product expert at Alibaba Cloud, gave an in-depth presentation on Alibaba Cloud Heterogeneous Computing in the era of big data and AI at the Computing Conference Shenzhen Summit on the morning of March 29, 2018.
Pan Yue explained that Alibaba Cloud provides a wide range of heterogeneous acceleration platforms for multiple scenarios: the GA1 instance (AMD S7150) for graphics and image rendering, the GN5 instance (Tesla P100) for AI training and inference, the GN5i instance (Tesla P4) for AI inference and video transcoding, the GN6 instance (Tesla V100) for advanced computational capabilities in AI and high-performance computing, and multiple FPGA instances for image transcoding, genomic computing, and database acceleration.
The GN6 instance is designed specifically for deep learning training and high-performance computing. Built on the Tesla V100 with NVIDIA’s latest Volta architecture, the GN6 instance delivers up to 12x the computing performance of the previous-generation Tesla P100, solving long-standing problems for engineers and experts. GN6 (Tesla V100) is now in public beta and will be generally available soon.
Alibaba Cloud takes a professional and precise approach to heterogeneous products. Many people would choose the Pascal-based Tesla P40 as their GPU because of its higher single-precision floating-point throughput: 12 teraFLOPS (TFLOPS), versus 10.6 TFLOPS for the Tesla P100, according to the product manuals. However, Alibaba Cloud ran tests across different scenarios and methods and found that the Tesla P100 performs about 20% better in AI applications. Here are the test results:
This article describes how to use the Online Predictive Deployment feature of Alibaba Cloud’s Machine Learning Platform for AI (PAI) to monitor user health in real time.
Models generated on the Alibaba Cloud Machine Learning Platform for AI (PAI) can be deployed online to generate APIs that can be invoked by other services. This document is based on the Heart Disease Prediction Case, and describes how to use the Online Predictive Deployment feature of the machine learning platform to monitor user health in real time.
Step 1: Model Deployment
Click Deploy in the lower section of the current experiment interface and select Online Predictive Deployment. Select the logistic regression model generated in the heart disease prediction case, as shown in the following screenshot.
Step 2: Model Deployment Information Configuration
Go to the model configuration page, as shown in the following screenshot.
Select the corresponding project. If this is your first time, you must enable the online prediction permission, which is granted in real time upon request. Then set the number of instances occupied by the current model. Instances are described as follows.
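Once deployed, an online model is invoked through a generated API. The sketch below builds a JSON request body for such an endpoint; note that the feature names, the `inputs` payload shape, and the endpoint/token placeholders are illustrative assumptions, not the actual PAI API contract.

```python
import json

# Hedged sketch: build the JSON body for an online-prediction request.
# FEATURES and the payload layout are hypothetical placeholders chosen
# to resemble the heart disease prediction case, not the real PAI schema.

FEATURES = ["age", "sex", "trestbps", "chol", "thalach"]  # hypothetical

def build_request(sample):
    """Turn a dict of feature values into a prediction request body."""
    missing = [f for f in FEATURES if f not in sample]
    if missing:
        raise ValueError(f"missing features: {missing}")
    return json.dumps({"inputs": [{f: sample[f] for f in FEATURES}]})

body = build_request(
    {"age": 57, "sex": 1, "trestbps": 130, "chol": 236, "thalach": 174}
)
# The body would then be POSTed to the deployed model's endpoint, e.g.:
#   urllib.request.Request(ENDPOINT_URL, data=body.encode(),
#                          headers={"Authorization": TOKEN})
```

Validating the feature set client-side, as above, gives clearer errors than letting the service reject a malformed request.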
IoT, Big Data, and AI are three of the most popular terms of recent times. In this course, we will introduce not only how these technologies are linked within Alibaba Cloud, but also how they have paved the way for the technological progress that can help us win the second half of the internet era.
In this course, you will get an overview of Big Data and AI products that can empower your business with ease.
This course will introduce Alibaba Cloud’s ultra-intelligent AI Platform for solving complex business and social problems. Built on advanced new technologies, Alibaba AI is driving global breakthroughs in artificial intelligence and machine learning.
Related Market Products
This session introduces how to use Alibaba Cloud Machine Learning Platform For AI to create a heart disease prediction model based on the data collected from heart disease patients.
This session shows how to use Alibaba Cloud’s advanced Machine Learning Platform for AI (PAI) to quickly apply the linear regression model to solve business-related prediction problems.
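To make the linear-regression idea behind this session concrete (independent of PAI), here is a minimal ordinary-least-squares fit in pure Python; the ad-spend-to-sales numbers are made up purely for illustration.

```python
# Minimal sketch of linear regression: fit y = w*x + b by
# ordinary least squares on a single feature.

def fit_line(xs, ys):
    """Return slope w and intercept b minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    w = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - w * mean_x
    return w, b

# Toy business data: ad spend -> sales (fabricated for illustration).
spend = [1.0, 2.0, 3.0, 4.0, 5.0]
sales = [2.1, 3.9, 6.2, 8.0, 9.8]
w, b = fit_line(spend, sales)
predicted = w * 6.0 + b   # predict sales for an unseen spend of 6.0
```

A platform like PAI applies the same principle at scale, with many features and distributed training, but the fitted model is still just coefficients applied to inputs.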
How do I upload data?
To upload data on the Machine Learning Platform for AI web interface, make sure that the data is less than 20 MB. To upload data that is greater than 20 MB, you must download the MaxCompute client and then use the tunnel command.
How do I set the algorithm parameters?
To set the algorithm parameters, drag an algorithm component to the canvas and click the component. The corresponding parameters are displayed in the right-side pane.
How do I view experiment results?
If Machine Learning Platform for AI has successfully run a component, it marks the component with a green check. You can right-click a component with a green check to view data or evaluation results.
How do I view and download the model generated from an experiment?
To generate a model, you must first select Setting > General > Auto-generate PMML from the left-side navigation pane. After successfully running an experiment, you can select Model from the left-side navigation pane to check the corresponding model. To view the model parameters, right-click the model. To download a model, right-click the model and select Download PMML.
What is PMML?
PMML is a standard model description file. A PMML file downloaded from Machine Learning Platform for AI can be applied to open-source engines, such as Spark.
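Because PMML is plain XML, a downloaded model file can be inspected with nothing but the standard library. The fragment below is a hand-written, simplified PMML snippet for illustration, not an actual PAI export; the field names are assumptions.

```python
import xml.etree.ElementTree as ET

# Hedged sketch: list the fields a PMML model declares.
# PMML_SNIPPET is a minimal hand-written fragment, not a real PAI export.

PMML_SNIPPET = """<?xml version="1.0"?>
<PMML version="4.2" xmlns="http://www.dmg.org/PMML-4_2">
  <DataDictionary numberOfFields="3">
    <DataField name="age" optype="continuous" dataType="double"/>
    <DataField name="chol" optype="continuous" dataType="double"/>
    <DataField name="target" optype="categorical" dataType="string"/>
  </DataDictionary>
  <RegressionModel modelName="demo" functionName="classification"/>
</PMML>
"""

NS = {"p": "http://www.dmg.org/PMML-4_2"}

def list_fields(pmml_text):
    """Return the field names declared in the PMML DataDictionary."""
    root = ET.fromstring(pmml_text)
    return [f.get("name")
            for f in root.findall("./p:DataDictionary/p:DataField", NS)]

print(list_fields(PMML_SNIPPET))   # the fields the model expects
```

This kind of quick inspection is handy before handing the file to a scoring engine such as Spark, to confirm the model’s expected inputs.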
The following figure shows the architecture of the Machine Learning Platform for AI.
The bottom layer is the infrastructure layer that consists of CPU and GPU clusters.
The second layer from the bottom is the Alibaba computing framework, which includes MapReduce, SQL, MPI, and other computing methods.
The middle layer is the model algorithm layer, which includes data preprocessing, feature engineering, machine learning algorithms, and other basic components that help users complete fundamental jobs.
The top layer is the application layer. The data mining behind Alibaba’s internal search, recommendation, Ant Financial, and other projects depends on the Machine Learning Platform for AI.
Intelligent Speech Interaction is developed based on state-of-the-art technologies such as speech recognition, speech synthesis, and natural language understanding. Enterprises can integrate Intelligent Speech Interaction into their products to enable them to listen, understand, and converse with users, providing users with an immersive human-computer interaction experience.
Relying on Alibaba’s leading natural language processing and deep learning technology, and trained on massive e-commerce data, we provide customized, high-quality machine translation services for Alibaba Cloud users.