Why Are We Watching the Same Show?

Live Augmented Streaming and the Future of Interactive Consumption

As we drift deeper and deeper into our Zoom reality, there’s one important part of the video experience that for some reason hasn’t been explored yet: why don’t we get to choose exactly what we’re seeing during live video?

There is certainly a time for the lean-back, zero-control environments that viewers seek out; just look at the rise of channel services created for cord-cutters, like Xumo or Pluto. Not everyone wants to click mindlessly through tiles for an hour, just to realize that Jigsaw was the best option for Halloween movie night.

But in live video, we’ve always deferred to the creator to choose the best viewing experience, and that’s an inherently flawed notion.

The pandemic has accelerated live digital video consumption to a place where it is now far more commonplace than it was even a year ago. In fact, in early April, when many of us were in quarantine, it was our only daily connection to the outside world.

But here we are, in the middle of a time where you can’t readily go to a movie, concert or sporting event, and consuming live video, especially on mobile, feels increasingly like a natural consumption pattern. And yet, there’s a seemingly misguided expectation that:

  1. Everyone needs to watch the same feed of a specific video
  2. That feed will be presented as if you are watching on a 60" TV

So let’s take a look at what interactive video features exist today.

From a personal live video perspective, Apple has added the ability to somewhat control how you interact with your friends whenever you FaceTime one another. You can apply filters, Memoji, or stickers within the call, driving you to spend more time connecting with them while personalizing and memorializing each call.

If you want, you can then take your favorite creations and share them to an existing social feed, a creative symbol of your relationship with each friend.

Snapchat and Instagram, too, have long incorporated filters, stickers and other animations for creators to send short-form, on-demand personal videos, but those services aren’t inherently viewed as live platforms.

On the other hand, Twitch is very much a live platform and has built viewer participation into the service as a native feature. From tipping to sound effects, custom commerce to virtual good exchanges, Twitch provides a layer of interactivity that enables viewers to engage in the overall stream experience.

Let’s say we take all of these concepts away from the current ecosystem and apply them to a potential future state.

If you’re live streaming a Selena Gomez concert, what if you could drop her into your favorite sweatshirt when she sings your favorite song?

You could use your phone’s camera to create a memoji of yourself singing alongside her in real time and maybe even change the background to your bedroom, so that it looks like it’s the two of you putting on a show in your house. You could then record your experience and share it to your feed so that all of your friends could see exactly how you watched the concert.

Meanwhile, your friends would likely be doing the same, transforming her into the version they want to watch, so your social feed would fill with dozens of different Selenas based on how each of you envisioned and created her for your personal concerts.

In many ways, your vision of Selena in that moment is actually the natural next step in the realm of narrowcasting, where content right now is recommended and delivered to you on a 1-to-1 algorithmic basis. Instead of having an algorithm serve something to you that you might like, why not turn any piece of live content into exactly what you want?

There are two recent examples where these tools appear to be on our doorstep, and musical performance seems to be the appropriate video vehicle for their ultimate delivery.

Travis Scott’s performance within Fortnite gave viewers a glimpse into an interactive and virtual consumption world. With visual effects that transformed and reimagined the concert environment, viewers could move around within the space while a Godzilla-sized rapper belted out their favorite songs.

Similarly, the Billie Eilish “Where Do We Go?” PPV livestream last week created a remote event experience that blended an interactive fan experience with stunning visual effects. Streaming from a stage in LA, Eilish and her production team used XR technology and the Maestro interactive platform to sing among surreal backdrops, while fans had the chance to bid on merchandise or chat with each other during the show.

Custom interactive live video has never really existed at scale before, but Gen Z and Gen Alpha are seeing the power of 1-to-1 content and, at some point, they’ll want more creative tools in order to make an experience that is inherently their own. Call it Live Augmented Streaming. Right now, all we’re giving them is the same tired content outputs that we’ve had on linear cable for decades.

How does that make any sense?

Media and Tech junkie. Always trying to find what’s next. Baseball addict. True Detective apologist. Old Bay evangelist.

Whit Harwood