But a human baby does not learn to recognize a cat this way. Babies go out, see a cat or two, and then they know what a cat looks like. They don’t need a million training samples; they only need one or two. So that’s very different: they don’t need big data to learn. Instead, small data is sufficient. This is one of the big challenges that needs to be addressed for AI to reach the next level. One possible route is to move from today’s data-driven AI to a future hybrid AI that integrates both data-driven and knowledge-driven approaches. So those are my thoughts about AI’s future.
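For readers who want to see the idea in code, here is a minimal sketch of one-shot classification in Python. The `embed` function below is a stand-in for a pretrained encoder, playing the role of the knowledge-driven prior; the random projection is a hypothetical placeholder, not a real model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder encoder: a fixed random projection standing in for a
# pretrained network that carries prior "knowledge" about images.
PROJECTION = rng.normal(size=(32 * 32 * 3, 64))

def embed(image: np.ndarray) -> np.ndarray:
    """Map a 32x32 RGB image to a unit-length feature vector."""
    v = image.reshape(-1) @ PROJECTION
    return v / np.linalg.norm(v)

class OneShotClassifier:
    """Classify by nearest prototype: one example per class is enough."""

    def __init__(self):
        self.prototypes = {}  # label -> embedding of the single example

    def learn(self, label: str, example: np.ndarray) -> None:
        self.prototypes[label] = embed(example)

    def predict(self, image: np.ndarray) -> str:
        q = embed(image)
        return max(self.prototypes, key=lambda lbl: q @ self.prototypes[lbl])

clf = OneShotClassifier()
clf.learn("cat", rng.random((32, 32, 3)))    # one "cat" example
clf.learn("dog", rng.random((32, 32, 3)))    # one "dog" example
print(clf.predict(rng.random((32, 32, 3))))  # nearest-prototype guess
```

The point is only that, given a good prior representation, a single labeled example per class can be enough to classify, which is what the hybrid data-plus-knowledge direction is after.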

Laurel: No, that’s amazing. It’s also really interesting to hear about AI in production in manufacturing, and then this idea of learning AI, where, like you said, a baby doesn’t need a million examples to learn what a cat is. And there’s clearly so much in between each of those ideas. This concept is particularly complex when we think about how AI is now changing and evolving into what we’re now calling the industrial metaverse, right? Which is a blend of virtual and real-world capabilities. This isn’t science fiction anymore; this is actually happening. So what are some examples of extended reality, or XR, that we might see in the next few years?

Yong: That’s correct, Laurel. Actually, I think the reality is here, but with that said, the metaverse itself is still in its early stage of development, and different people have different definitions of what the metaverse is. From Lenovo’s perspective, the metaverse is a hybrid of physical and virtual worlds where people and objects connect and interact with each other. XR devices, including AR, VR, and MR, are the key human-machine interface of, and the portal to, the metaverse, with combined and reinforced information from both worlds, the physical and the virtual. In the metaverse, we can provide users with more immersive and interactive experiences and solve industry challenges with higher efficiency and lower cost. Let me give you an example from the electric power industry. Historically, the inspection of power station equipment has been time-consuming and sometimes, as you can imagine, dangerous. Besides, human workers can make mistakes, causing power outages and other accidents.

As such, these tasks can incur high costs for power companies. But now, with the new metaverse technologies, we have the possibility to transform this into a safer and more efficient industry. The key is to build a metaverse that connects the virtual and the physical. We actually thought about this a lot, and we concluded there are three ways to achieve this: physical-virtual mapping, physical-virtual superimposition, and physical-virtual interactivity. Again, let me use the electric power industry example to illustrate. First, physical-virtual mapping means that we need to build a virtual version of the physical power station, which we refer to as the “meta-space.” Actually, two months ago we just finished our Tech World [conference], where I gave a pretty detailed description of this meta-space. I probably won’t have time to go over that today, but for those in the audience who are interested, I would refer them to Lenovo Tech World 2022, which happened in October; they can find a more detailed scenario there.
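As a rough illustration of the physical-virtual mapping idea, here is a minimal digital-twin sketch in Python. The equipment names, coordinates, and telemetry fields are invented for illustration and are not Lenovo’s actual meta-space design.

```python
from dataclasses import dataclass, field

@dataclass
class EquipmentTwin:
    """Virtual counterpart of one piece of physical equipment."""
    equipment_id: str
    location: tuple[float, float, float]  # position in the meta-space
    telemetry: dict[str, float] = field(default_factory=dict)

    def update(self, reading: dict[str, float]) -> None:
        """Mirror a sensor reading from the physical station."""
        self.telemetry.update(reading)

@dataclass
class StationTwin:
    """The 'meta-space': a virtual model of the whole power station."""
    equipment: dict[str, EquipmentTwin] = field(default_factory=dict)

    def ingest(self, equipment_id: str, reading: dict[str, float]) -> None:
        self.equipment[equipment_id].update(reading)

# Hypothetical usage: keep a virtual transformer in sync with its sensors.
station = StationTwin()
station.equipment["transformer-7"] = EquipmentTwin(
    "transformer-7", (12.0, 4.5, 0.0)
)
station.ingest("transformer-7", {"temperature_c": 68.2, "load_pct": 81.0})
print(station.equipment["transformer-7"].telemetry)
```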

After this physical-virtual mapping comes physical-virtual superimposition, which means we overlay digital information onto real objects through, for instance, AR glasses. This significantly augments the capabilities of human workers, allowing them to check equipment status, identify malfunctions faster, and perform maintenance tasks more efficiently.
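Here is a minimal sketch of that superimposition step, assuming some vision model has already recognized a piece of equipment in the camera frame; the detection format and telemetry lookup are hypothetical placeholders, not a real AR SDK.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Equipment recognized in the camera frame by a vision model."""
    equipment_id: str
    x: int  # pixel position of the recognized equipment
    y: int

# Hypothetical live-telemetry lookup; a real system would query the
# station's digital twin over the network.
TELEMETRY = {"transformer-7": {"temperature_c": 68.2, "load_pct": 81.0}}

def overlay_labels(detections: list[Detection]) -> list[tuple[int, int, str]]:
    """Return (x, y, text) labels to draw on the AR display."""
    labels = []
    for det in detections:
        readings = TELEMETRY.get(det.equipment_id, {})
        text = ", ".join(f"{k}={v}" for k, v in readings.items())
        labels.append((det.x, det.y, f"{det.equipment_id}: {text}"))
    return labels

print(overlay_labels([Detection("transformer-7", 420, 180)]))
```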

And thirdly, human workers are not able to cover every corner of the power station, especially hazardous areas that pose risks to health and life. In that case, human workers can send a physical robot to do the job in their place: they can plan a path for the robot in the virtual power station, and then the robot moves through the physical power station and performs the inspection tasks, including recognizing equipment readings, detecting abnormal heat, and monitoring equipment status. This third way we call physical-virtual interactivity. So those are the three ways we think the virtual and physical worlds can be connected. Of course, I used power station inspection as an example to illustrate the metaverse, but these technologies really can create huge opportunities across many other industries.
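A minimal sketch of the plan-in-virtual, execute-in-physical idea: breadth-first search over a grid map of the virtual station, where blocked cells stand in for hazardous or impassable areas. The map and coordinates are made up for illustration.

```python
from collections import deque

# Hypothetical floor plan of the virtual station: 0 = free, 1 = blocked.
GRID = [
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]

def plan_path(start, goal):
    """Breadth-first search: a shortest route that avoids blocked cells."""
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(GRID) and 0 <= nc < len(GRID[0])
                    and GRID[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None  # no safe route exists

# Plan in the virtual station, then send waypoints to the physical robot.
print(plan_path((0, 0), (3, 3)))
```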

Laurel: You can really imagine that example being absolutely industry-changing in healthcare as well. Yeah, that’s really quite astounding. And I think it’s a good distinction to define for folks what the industrial metaverse could bring us: the ability, with XR technologies, to do things that haven’t been done before through that blend of the virtual and the physical world. So speaking of the physical world, how will adoption of the technologies we’ve been talking about today help with sustainability and environmental, social, and governance, or ESG, goals? And what are Lenovo’s own sustainability goals? Because, as you mentioned, those enormous factories producing one out of every eight laptops in the world pose quite a challenge for Lenovo as well.

Yong: Laurel, thank you for asking this question. I think with any technology innovation, we should also be responsible. You just mentioned the huge factory, LCFC, right? Imagine if we can save 5% of its electricity; that would reduce a lot of carbon emissions. That’s definitely something we are very committed to and are working on very seriously. Overall, Lenovo is committed to achieving sustainable growth by helping decarbonize the global economy.
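To make that 5% figure concrete, here is a back-of-the-envelope calculation. The annual consumption and grid emission factor below are illustrative assumptions, not LCFC’s actual numbers.

```python
# Illustrative assumptions (not actual LCFC figures):
annual_consumption_kwh = 100_000_000  # hypothetical factory usage per year
grid_emission_factor = 0.0005         # tonnes CO2 per kWh (typical order)
savings_rate = 0.05                   # the 5% saving mentioned above

saved_kwh = annual_consumption_kwh * savings_rate
saved_tonnes_co2 = saved_kwh * grid_emission_factor
print(f"{saved_kwh:,.0f} kWh saved ~ {saved_tonnes_co2:,.0f} tonnes CO2 avoided")
```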


