06 | Google I/O, AI Fairness and Women in Tech, and the Edison Awards




JUNKO YOSHIDA: This is Junko Yoshida, EETimes Chief International Correspondent. You're listening to EETimes on Air.

DAVID FINCH: This is your EE Times Weekly Briefing. Today is Friday, May 10th, and among the top stories this week:

*Google I/O, Google's developers’ conference. CEO Sundar Pichai touted Google’s awakening to privacy for its users’ data.

*This week, EETimes launched a new Special Project package on Artificial Intelligence, with a particular focus on AI fairness. We ask and answer the question: “Will Machines Ever Learn to Be Fair?”  

*Later on, we’re joined by Junko Yoshida, EE Times’ chief international correspondent, and EETimes executive editor Dylan McGrath. The two editors moderated panels at VerveCon in sunny Santa Clara. They share their observations at this unusual tech conference, where the main auditorium was filled not by male but female engineers.

All that to come, but first, here’s Rick Merritt, EE Times Silicon Valley bureau chief. Rick, who attended Google I/O, met up with Dylan afterward. Here are our two editors discussing the highlights of this year’s Google developers' conference.


DYLAN McGRATH: Rick, I understand you attended Google I/O today. Can you give us kind of a high-level view of what you saw, what you heard, how it went?

RICK MERRITT: At the highest level, the script that Google CEO Sundar Pichai had about AI everywhere and his high concern for privacy could have been the same script that Mark Zuckerberg used last week at the Facebook developer conference. They're both really concerned about the increasing government scrutiny about how they're selling and sharing and what they're doing with people's personal data, particularly as they're mining more and more of it with AI.

So the focus was, "We're good players here. We're really concerned about your privacy." And there was almost more concern about these social issues than there was about commercial stuff.

Though some news did get announced.

DYLAN McGRATH: And what was some of that news?

RICK MERRITT: Google came out with some mid-market Pixel smartphones and a new home display to enhance their smart home story. But really, the underlying thread to all of it was that they're doing more and more AI everywhere, and they're trying to do more of it on device, so that you don't have to go to the Cloud. And that supports their privacy story.

I think the interesting thing there, though, is you can see where they're wanting to get some commercial advantages, so Google did demo some work on voice interfaces that are getting better and better on the smartphone. So they showed somebody being able to walk through multiple applications and do a mixture of commands and dictation. And their assistant understood what they were doing in the canned demos, and did a pretty good job of it. So their comment was, later this year, the software's going to roll out for their Pixel phones, and the voice interface will be faster than using a touch screen display. That's a significant advantage for them.

DYLAN McGRATH: Absolutely. Well, I guess the similarities between that and the Facebook event really show where Silicon Valley's head is these days with regard to these privacy issues.

You wonder if it's just lip service and if they're serious about this, or if they're just trying to stay out of trouble. Off the government's radar.

RICK MERRITT: Yeah. There's definitely a lot of lip service there and some big fines coming up, but we'll see.

DYLAN McGRATH: Okay. All right, Rick. Well, thank you very much.

RICK MERRITT: Good talking to you, Dylan.

DYLAN McGRATH: Good talking with you.

DAVID FINCH: And now, Junko explains why EE Times has gone after the loaded issue of “fairness” in AI. This ambitious Special Project, shepherded by EE Times senior contributing editor Ann Thryft, digs deep into the challenges of AI. Predictably, Ann found out that most marketers of AI systems-- or so-called “AI solutions”-- prefer not to talk about fairness. However, AI researchers were much more candid.

JUNKO YOSHIDA: This year, Rick Merritt, EE Times Silicon Valley Bureau Chief, put together a stellar Special Project, It's Still Early Days for AI. Rick covered a wide range of AI issues from deep learning models and neural networks to AI chips for both learning and inference.

We follow AI aggressively because we know big changes in AI will impact next generation computer architecture. Our job at EE Times is to be there when the impact hits so we can explain what happened and what comes next.

This week, we launched a new Special Project on AI, shepherded by EE Times Senior Contributing Editor Ann Thryft. This time around, our focus is on the fairness of AI.

Unlike a topic like AI performance-- which is typically explained and measured in teraflops, gigahertz and watts-- we decided to pursue this more elusive concept of fairness. Why, you may ask? Because in the pursuit of automation, humans are beginning to cede to machines whole realms of decision making for tasks like hiring, credit scoring or customer service, and even driving. When we take people out of the loop, we assure ourselves machines are more efficient. They do the job faster, cheaper and more accurately, without the blunder, bias and fatigue that afflict mere mortals. Or do they?

In recent years, AI researchers have realized AI is not as accurate as it's cracked up to be, as Ann Thryft, born skeptical, points out. Much of the fairness of an AI's decision making depends on the accuracy and completeness of the data sets used to train its algorithms. This is an elegant way to say garbage in/garbage out. A machine's decision also depends on the accuracy of the algorithm itself and how it understands success. The optimization strategy used by a training algorithm can actually amplify bias sometimes.

The black box nature of such algorithms is also worrisome, making it almost impossible for humans to explain how machines reach certain conclusions.

The engineering community has made remarkable progress with AI. We all applaud it. But we pause the clapping when we hear the AI developers say, "AI works most of the time." Most of the time isn't good enough. People will expect the machines to decide soundly, safely, accurately and fairly. People hold other people accountable. They expect no less from AI. In engineering, we often say, "security by design." We recognize it's high time for an engineering community committed to AI development to start thinking about fairness by design.

Our AI Fairness Special Project includes real world cases of bad AI behavior, coupled with discussion of an emerging framework and potential standards that define AI fairness. We ask, and answer, whether AI fairness can be regulated. We also explore tools-- although not many yet-- under development, designed to de-bias or audit algorithms and data sets.
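One of the simplest checks such an auditing tool can run is a demographic-parity measurement: compare the rate of favorable decisions across groups. Here is a minimal sketch in Python; the decisions, group labels and numbers are entirely made up for illustration, not drawn from any real system.

```python
# Hypothetical fairness audit: measure the "demographic parity" gap
# between two groups of applicants. 1 = approved, 0 = denied.

def approval_rate(decisions, groups, target):
    """Fraction of applicants in the target group who were approved."""
    picked = [d for d, g in zip(decisions, groups) if g == target]
    return sum(picked) / len(picked)

decisions = [1, 1, 0, 1, 0, 1, 0, 0, 1, 0]
groups    = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

rate_a = approval_rate(decisions, groups, "A")  # 0.6
rate_b = approval_rate(decisions, groups, "B")  # 0.4
parity_gap = abs(rate_a - rate_b)               # 0.2
print(parity_gap)
```

A real audit would run over many more records and look at additional metrics (equalized odds, calibration), but a parity gap like this is a common first screen for the kind of bias the project documents.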

Author Ann Thryft poses five big questions about fairness of AI to researchers from MIT, Stanford and IBM. The bottom line is simple: We can't just assume AI will be any fairer and more accurate than its human parents. We need designers of AI software and hardware to start thinking about fairness before they embark on their next AI project. You can't add fairness to your system as an afterthought.

This is Junko Yoshida, EE Times.

DAVID FINCH: Sally Ward-Foxton, EE Times European Correspondent, who also contributed to our Special Project on AI fairness, now explains more specifically how financial institutions are increasingly using AI-- particularly machine learning-- to make decisions on credit scores, credit risks and lending, and where bias creeps into the process. Here's Sally with more.

SALLY WARD-FOXTON: Financial institutions have embraced AI and machine learning technology to determine consumers’ credit scores and decide on their loan applications, because the technology can consider large amounts of data and make quick and accurate decisions. 

The trouble is, even though there are no people involved in making the decision, studies have shown that these systems can still exhibit unintentional bias against minority groups. This is despite the law in the US that makes this type of discrimination illegal. 

When we’re talking about consumer credit scores and loan decisions, there's obviously a lot at stake. When you’re deciding who gets a loan, you might be deciding whether that person can own their home, whether that person goes to college, or whether that person can cover their medical expenses. If the decision-making process is biased against any group of people, there are big implications for society as a whole. 

So how can banks check for bias in their models, and how do they fix it? These systems consider thousands of variables from each applicant and use techniques like neural networks to model complicated interactions between the variables. As these techniques develop and evolve, the complexity of these models will only increase. In other words, it can be pretty difficult to tell where that bias is coming from, and it's only going to get harder. 

I spoke to AI model fairness expert Jay Budzik. His company, ZestFinance, uses mathematical game theory to analyze banks’ models. They can determine which variables are driving the bias, and then they can tune the model to make a better trade off between accuracy and fairness. 
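The transcript only says that ZestFinance's approach is game-theory based (the details are proprietary). As a crude stand-in for the idea of determining which variables drive the bias, the sketch below neutralizes one input of a made-up linear credit model and checks how much of the group disparity disappears; every feature name, weight and applicant here is hypothetical.

```python
def score(applicant, weights):
    """Toy linear credit model (illustrative only)."""
    return sum(w * applicant[name] for name, w in weights.items())

def approval_gap(applicants, weights, threshold=1.0):
    """Absolute difference in approval rates between groups A and B."""
    rates = {}
    for grp in ("A", "B"):
        members = [a for a in applicants if a["group"] == grp]
        approved = [score(a, weights) >= threshold for a in members]
        rates[grp] = sum(approved) / len(approved)
    return abs(rates["A"] - rates["B"])

# "zip_risk" stands in for a geographic proxy variable that correlates
# with group membership -- a classic source of indirect bias.
weights = {"income": 1.0, "zip_risk": -1.0}
applicants = [
    {"group": "A", "income": 1.2, "zip_risk": 0.1},
    {"group": "A", "income": 0.9, "zip_risk": 0.1},
    {"group": "B", "income": 1.2, "zip_risk": 0.6},
    {"group": "B", "income": 0.9, "zip_risk": 0.6},
]

baseline = approval_gap(applicants, weights)                    # 0.5
no_zip = approval_gap(applicants, dict(weights, zip_risk=0.0))  # 0.0
print(baseline, no_zip)
```

In this toy case the disparity vanishes once the geographic proxy is neutralized, which pinpoints that variable as the bias driver. The trade-off Budzik describes is that neutralizing such variables in a real model usually costs some predictive accuracy.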


So there are ways of making AI fairer. The real question is, will banks choose to use them? These models are highly optimized for accuracy, to make the banks the most profit, which implies that changing the model in any way might mean they don’t make as much money. So fairness, despite the law, may well be a difficult sell. 

This is Sally Ward-Foxton reporting from London for EETimes.

DAVID FINCH: At VerveCon, a conference devoted to women in tech, Junko and Dylan worked together as moderators of a keynote session and a roundtable. Their panelists were, in every sense of the word, "the best and the brightest" in the tech industry today, including distinguished engineers and engineering directors from companies such as Google, Microsoft, Oracle, LinkedIn and Intuit.

Here are Junko and Dylan after the conference.


JUNKO YOSHIDA: This is a rare occasion: I happen to be in Silicon Valley, and I'm with Dylan McGrath, Executive Editor of EE Times. And we just came back from the conference called VerveCon, which is a women-in-tech conference. This is the second year of that conference, and we had about 800 people, I think.

DYLAN McGRATH: Give or take.

JUNKO YOSHIDA: Yeah. What was your impression, considering that of the 800 people who were there, probably 99% were women?

DYLAN McGRATH: Well, it was quite a difference from most of the conferences, most of the engineering conferences I attend. In fact, I had a conversation with someone in line: basically there were only a handful of men there, and she made the point that this is how she typically feels when she goes to a conference. And it really was quite the reverse. I've never been to a conference like that. So it was eye-opening.

JUNKO YOSHIDA: Yeah. This is a conference, actually, I think it serves two purposes. One is more of a career development conference, but also the founder-- Sudha is the founder of this conference-- she believes in continuous education. That means that there are a lot of technical sessions to develop your career, so everything from AI to blockchain to natural language processing. There are a lot of tech seminars, too.

DYLAN McGRATH: Absolutely. That was one of the things that surprised me the most, was that the content at the conference was not just for women.

JUNKO YOSHIDA: Exactly.

DYLAN McGRATH: It wasn't all built around being a woman in technology. A lot of it was just straight technology. And in that sense, other than the kind of demographic of the attendants, which was mostly women, it wasn't that much different than these other conferences.

JUNKO YOSHIDA: Exactly. But another thing that was interesting to me was, Dylan and I moderated two panels, and both panels were excellent, but I felt like we were able to see a little bit behind the curtain of top technology companies in Silicon Valley, what's going on in the work environment, how they actually grow people within the company. You know, for example, like Google, right?

DYLAN McGRATH: Yeah. Yeah. Fascinating to see. I mean, they obviously have their own distinct culture. And I also found, when we talked about this afterwards, one of the most interesting things that was discussed was how being at a company like that, you're just surrounded by the best and the brightest of people who have excelled throughout their college and early career and have always been the smartest person in the room. And now this is the first time that they're not.


JUNKO YOSHIDA: This is the first time!

DYLAN McGRATH: And that's something I never really thought about before.

JUNKO YOSHIDA: Yeah, they're all kind of homogeneous, right? They went to the top schools. They probably haven't had any experience of big failures or anything. So they come to Google, and this is the first time they realize they're just in the middle of the pack.

DYLAN McGRATH: Right. Yeah. So that was quite eye-opening.


JUNKO YOSHIDA: So how you differentiate yourself was one of the big conversations.

DYLAN McGRATH: And the answer given was: it's resiliency that makes a leader. Someone that is... not that they're afraid to fail, but someone who WILL fail and will continually get up and try again.

JUNKO YOSHIDA: Right. And also, I think the emphasis was that, in order to be a leader, you really need to find your allies, right? Whether you are career hopping within the company, you always need to find your allies, your mentors, and then if you're lucky, you get what they call "sponsors." Meaning somebody who can vouch for you, who can talk you up. And it's not a formal relationship, but it seems like the culture is there to help each other.

DYLAN McGRATH: Yeah. And I think that was another one of the really eye-opening parts about the conference. Again, it did focus a lot on career development and the importance of having a good mentor, a good sponsor, someone to serve as a sounding board and help you guide your own career. That's a very interesting concept.

JUNKO YOSHIDA: All right. Well, thank you very much. It was good to see you.

DYLAN McGRATH: Good to see you, too, Junko.

JUNKO YOSHIDA: All right. Thanks.

DYLAN McGRATH: Talk to you soon.

DAVID FINCH: And finally, it's Mother's Day weekend here in the US. And we conclude with a story about saving the lives of mothers and children in underprivileged areas in this piece sponsored by Arrow Electronics.

Recently, I was joined by Victor Gao, Chief Marketing Officer at Arrow, to talk about a project called The Solar Suitcase, which was a joint development project between Arrow Electronics and We Care Solar.

Victor, welcome. And please tell me a little bit about this project. Why The Solar Suitcase?

VICTOR GAO: So every year, more than 300,000 women die in childbirth because they happen to go into labor at night when it's dark, and there's no electricity. So they could bleed out, they could be infected, what have you. And a solution for that is really just lighting. And as we looked at what we do really well at Arrow, we work with We Care Solar to design a suitcase that is essentially a solar-powered battery.