Google: Audio should be a “first-class citizen”

Picking up from yesterday’s Part 1 of the conversation between Pacific Media’s Steve Pratt and Google Podcasts Product Manager Zack Reneau-Wedeen, today’s installment in the five-part series has Reneau-Wedeen evangelizing audio and planning its rise in Google’s information ecosystem.

“There’s no good reason why audio isn’t a first-class citizen,” he said. The Google executive was speaking mainly about podcasts.

As we covered yesterday, Google has already stealthily implemented a remarkably streamlined podcast discover-and-listen process integrated into Google Search, one of the most used functions on the internet. Search for a podcast title, and Google presents a playable module that includes the three most recent episodes, a one-touch link to a two-month list of archived shows (each one directly playable), and another touch command to put that bundle of results onto the Android home screen. In our testing, this worked perfectly. The product eliminates the act of opening a podcast app, which suddenly seems cumbersome by comparison.

A second phase of this streamlining is more difficult to accomplish — namely, delivering podcast results for search queries that match podcast content. Currently, that need is met mainly with show transcripts posted to the program’s web page. That’s not a bad solution, but it is a time-consuming and sometimes expensive workaround for the internet’s basic inability to natively understand spoken-word audio. Cracking that nut is Google’s ambition.

“In the longer term, integrating with Search means figuring out what each podcast is about and understanding the content of that podcast,” Reneau-Wedeen said. “This is something Google has done extremely well for text articles, as well as for images and even more structured data such as maps. We can help with audio, too.”

See Part 2 of Steve Pratt’s series HERE. Part 1 is HERE.

Brad Hill