
Kellogg Insight

…tools available that can really move things, you have an obligation to understand the larger impact.

SUH: Accountability is one of the five areas we are focusing on for creating trust in AI. Many businesses are applying AI not just to create better experiences for consumers, but to monetize for profit. They may be doing it in ways where, say, data rights may not be balanced appropriately with the return on economic value or efficiency. So it's an important discussion: Who's accountable when there are risks in addition to benefits?

ZETTELMEYER: Do you think this is new?

SUH: I do, a little bit, because in previous scenarios, business programs and applications were programmable. You had to put in the logic and rules [explicitly]. When you get into machine learning, you're not going to have direct human intervention at every step. So then, what are the design principles that you intended?

ZETTELMEYER: So a fair way of saying this is, in essence, we've always had this issue of ownership, except with machine learning, you can potentially get away with thinking you don't need it. But you're saying that that's a fallacy, because you do need accountability at the end of the day when something blows up.

SUH: Exactly. And this goes back to [training an algorithm to have] a fundamental understanding of right and wrong in a wide range of contexts. You can't just put the chatbot into the public sphere and say, "Here, just go learn," without understanding the implications of how that system actually learns and the subsequent consequences.

Based on insights from Florian Zettelmeyer and Inhi Cho Suh

The Marketing Leader's Guide to Analytics and AI