BRIAN SANTO: I’m Brian Santo, EE Times Editor in Chief, and you're listening to EETimes on Air. This is your Briefing for the week ending June 21st.
Our lineup this week includes:
A guided tour through London’s Tech Week, an annual extravaganza of new technologies. Unsurprisingly, this year there was an emphasis on artificial intelligence,
We’ll have a report on the race to build the fastest supercomputers,
…And … you know those GPS apps you use for driving? Self-driving vehicles use maps, too, but they need maps that are far more accurate. We’ll hear about that in a moment.
First up, EE Times editor Sally Ward-Foxton attended several events during London’s Tech Week. The UK is bidding to become a major hub for AI technology, but the same idea has occurred to other countries as well.
And a quick translation of English to English for you. Glastonbury is a music festival not dissimilar to the New Orleans Jazz Festival, where savvy festival veterans know to show up in knee-high rubber boots because enormous mud puddles are not uncommon.
Here’s Sally.
SALLY WARD: Last week I attended several events as part of London Tech Week, a series of conferences and exhibitions all about emerging technology, with a particular focus on AI.
London is positioning itself as an innovation hub for AI technology. The government’s figures say there are at least three times as many AI startups in London as any other city in Europe, and it’s the UK’s fastest growing sector. Prime Minister Theresa May opened the event by pledging millions of pounds of government funding to support the development of AI and related technologies such as quantum computing.
But London isn’t the only city with its eyes on the AI prize.
Alexandra Dublanche, a representative from the Paris Regional Government, introduced the city’s AI 2020 plan, which aims to support SMEs who want to get into AI with a range of programs designed to foster innovation. This includes AI technology challenges set by the Paris Region with several million Euros in prizes.
Aside from London and Paris, there are many other regions battling to be the home of AI technology.
Another presentation at the AI Summit was from Foteini Agrafioti, Chief Science Officer for the Royal Bank of Canada. She’s also the head of Borealis AI, the bank’s research institute for AI technology. She made a compelling case positioning Canada as the natural home of AI, given the country’s academic prowess in the subject; it was of course scientists from the University of Toronto that famously won the ImageNet contest in 2012. Today, Canada has more than 600 researchers and 60 faculty members in universities working on cutting edge AI research, she said.
The event’s exhibition also hosted big pavilions from countries such as Romania, Ukraine, and South Africa, keen to show off their country’s burgeoning AI offerings.
I also attended the CogX event (short for CognitionX), which is billed as “The Festival of AI and Emerging Technology.” It definitely had shades of Glastonbury on Monday, with several stages in large tents on the lawn, as the rain poured and the area became a quagmire.
While the CogX program covers everything from research to ethics, the presentations on the future of AI hardware were of particular interest.
Analyst James Wang from ARK Invest presented a detailed overview of the AI chip startup landscape. He described it as “The AI Chip Hunger Games,” with dozens of startups, plus almost all the processor incumbents, desperately trying to become the next ARM, the next Intel, or the next Nvidia. He did say that not all the contestants will make it through the coming years – especially since half the startups are still only shipping powerpoints.
James Wang also noted that there are at least half a dozen startups pursuing optical computing for AI – using lasers to do fast matrix multiplication – which sounds interesting, so we’ll have to keep an eye on that technology.
Graphcore CEO Nigel Toon presented a compelling 20-minute monologue on how today’s hardware is holding back the development of artificial intelligence. We’ll need to improve hardware performance by at least a factor of 100, he said. Graphcore’s IPU chip is set to address this. It’s the most complex processor ever built, with 24 billion transistors and more than 2,000 processor cores.
Outside of digital computing, there were a couple of interesting presentations about alternative technologies.
Mike Henry, co-founder of American startup Mythic, presented the company’s analog computing technology, which stores 8 bits in a single Flash memory transistor, then uses a Flash memory array as an efficient matrix-multiply engine. In this case, there’s no bottleneck between the processor and the memory, because the processor IS the memory. It’s relatively cheap to build, as the technology uses the standard 40nm Flash process.
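Conceptually, the flash array computes a matrix-vector product in a single analog step: each cell's stored conductance acts as a weight, input voltages drive the rows, and the current summed on each column is a dot product. Here is a minimal numerical sketch of that idea; the array size and the particular 8-bit quantization scheme are illustrative assumptions, not Mythic's actual cell design.

```python
import numpy as np

# Hypothetical weight matrix "programmed" into a flash array: each cell
# holds one of 256 conductance levels (an illustrative 8-bit encoding).
rng = np.random.default_rng(0)
weights = rng.uniform(-1, 1, size=(4, 8))
levels = np.round((weights + 1) / 2 * 255)   # quantize to 8 bits
conductance = levels / 255 * 2 - 1           # map levels back to [-1, 1]

# Input activations applied as "voltages" on the array's rows.
x = rng.uniform(0, 1, size=8)

# Summing the current on each column (Kirchhoff's current law) happens
# in one analog step; digitally, it is just a matrix-vector product.
currents = conductance @ x

# The quantized analog result closely tracks the full-precision product.
error = np.max(np.abs(currents - weights @ x))
```

Because the multiply happens where the weights are stored, no weight data ever moves between a separate memory and processor, which is the bottleneck Henry was referring to.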
Mythic is pitching its technology squarely at AI inference chips in edge devices, where Henry said there are plenty of niches for 60+ inference chip start-ups to play in.
We also heard from the CEO of Oxford Quantum Circuits, Ilana Wisby.
Her company is pursuing quantum computing using superconducting metals cooled down to just above absolute zero – 10 millikelvin – and that’s the easy part. Building repeatable, reliable qubits is still rather difficult; and to build something useful, you’d need an array of at least 50 high-quality qubits that can be addressed and manipulated on demand, she said. The potential of this technology is massive, but it’s still quite a way off. Oxford is working on building a quantum device for early stage applications in 5 to 7 years.
This is Sally Ward-Foxton reporting from London Tech Week for EETimes.
BRIAN SANTO: Rick Merritt is based in Silicon Valley, but he was monitoring the International Supercomputer Conference held this week in Frankfurt, Germany. There’s a perpetual international competition to build the fastest supercomputer.
So, Rick, any news from Frankfurt on which country gets supercomputer bragging rights?
RICK MERRITT: There were no major changes in the rankings of the top 500 supercomputers that came out recently. But this is really the quiet before the storm, because Intel, AMD, IBM, Nvidia and Cray are all working in various collaborations on three major exascale projects in the US, and China has three exascale projects of its own in the works. So by 2021 we're going to see a lot of shake-up on the list, and it will be quite dramatic.
BRIAN SANTO: Okay. There are different ways to build a supercomputer, and different tests to measure supercomputer performance, and now the benchmarking is becoming a little controversial, right?
RICK MERRITT: Actually, the goal posts are moving, because all the systems on the list today, exascale ambitions and petaflops machines alike, are measured on the Linpack benchmark, which most agree is not a really good measure of real-world performance. So there’s a new benchmark on the list, too, called the High Performance Conjugate Gradient (or HPCG). And using that, most of the top US, European and Japanese systems still rank quite high. But interestingly, the Chinese systems drop down quite a bit when you use that benchmark.
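The two benchmarks stress very different things: Linpack times a dense linear solve dominated by raw floating-point throughput, while HPCG times a sparse conjugate-gradient solve dominated by memory access patterns closer to real scientific codes. A toy sketch of the two kernels, with problem sizes and matrices chosen purely for illustration:

```python
import numpy as np

n = 200
rng = np.random.default_rng(1)
b = rng.standard_normal(n)

# Linpack-style workload: solve a dense system via LU factorization,
# dominated by O(n^3) floating-point work.
A_dense = rng.standard_normal((n, n)) + n * np.eye(n)
x_dense = np.linalg.solve(A_dense, b)

# HPCG-style workload: conjugate gradient on a sparse SPD matrix
# (a 1-D Laplacian here), dominated by repeated memory-bound
# matrix-vector products rather than raw flops.
A_sparse = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Textbook CG iteration for a symmetric positive-definite A."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

x_cg = conjugate_gradient(A_sparse, b)
```

A machine tuned to win at the first kernel can still stumble on the second, which is why adding HPCG to the list reshuffles the rankings.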
BRIAN SANTO: Uh-hm. So who is currently in the lead?
RICK MERRITT: Overall, China still really leads in the number of top supercomputers, with 219 systems on the list versus just 116 for the US. But the US does better in terms of the total performance on the list with 38% versus just under 30% for China.
BRIAN SANTO: Alright. So that’s the system-level view. Anything new when it comes to ICs for supercomputers?
RICK MERRITT: One interesting sidelight at the event was that Nvidia announced it's supporting Arm, releasing open source software to support Arm processors next to its own accelerators. And Nvidia is by far the most popular accelerator provider in supercomputers with its GPUs; they're used in 112 of the 133 systems that have some kind of accelerator. Nvidia is already supporting Intel and IBM Power processors, so it was no surprise it's going to start supporting Arm as well, especially since the European Union has an exascale project that's using Arm processors, and Nvidia would certainly like to supply the accelerators for that.
BRIAN SANTO: Thanks, Rick.