Bose AR Unity Workshop

Description: Filip Baba, Senior Developer Advocate at Bose, walks students through a tutorial for creating programs that work with Bose Augmented Reality glasses.

Speaker: Filip Baba

[CREAK]

[WHOOSH]

[TAPPING]

 

FILIP BABA: All right, so welcome, everyone, to our Bose AR Unity workshop. My name is Filip Baba. I'm a senior developer advocate at Bose.

So what that means, basically, is I work with developers, enabling them to build experiences for our platform. I host workshops. We do game jams, hackathons, also public speaking on behalf of the platform. And if you do end up building something with the Bose AR Unity SDK specifically-- I'm on the Unity side-- you'll most likely be talking to me, and I'll be helping you get your idea and your product out there, trying to bring Bose on board to help you out with that as much as possible. So I am your voice when it comes to these things.

So I'm just going to talk briefly about what Bose AR is. Some of you may already know, but I'm just going to get into it. So we have a new concept called audio augmented reality. Now, many of you have probably already heard about augmented reality. The first things you probably think about are HoloLens, ARKit, ARCore, Magic Leap, stuff like that.

So we are actually audio-first augmented reality. Now, what that means is, think spatial audio, for example, where you have sounds placed in space around you, and you're virtually hearing them spatially. What enables us to do audio augmented reality are the sensors that are located in each of these Bose AR wearables.

You'll have a pair of frames, but we also have our headphones as well, which have the same set of sensors in them. And those sensors are an accelerometer, which lets you detect impulses and sudden movements; a gyroscope, which tracks rotation-- typically what we use the gyro for is to track the user's head orientation; and a magnetometer, which is basically a compass, and we use that for our navigation apps. So if you know the GPS position of a user based on their phone, and you use the magnetic heading from where their head is actually pointing, you can know what they're looking at on a map pretty accurately, actually.
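As a rough sketch of that GPS-plus-heading idea-- this is not Bose's implementation, and the type and helper names here are made up for illustration-- you could compare the compass bearing from the user to each point of interest against the head heading reported by the magnetometer:

```csharp
using UnityEngine;

// Hypothetical sketch: given the phone's GPS fix and the wearable's
// magnetic heading, find which nearby point of interest the user is
// facing. All names and the POI list are illustrative, not SDK API.
public struct PointOfInterest
{
    public string Name;
    public double Latitude;
    public double Longitude;
}

public static class HeadingLookup
{
    // Compass bearing (degrees clockwise from north) from one
    // lat/long to another, using the standard great-circle formula.
    public static double BearingTo(double lat1, double lon1,
                                   double lat2, double lon2)
    {
        double phi1 = lat1 * Mathf.Deg2Rad, phi2 = lat2 * Mathf.Deg2Rad;
        double dLon = (lon2 - lon1) * Mathf.Deg2Rad;
        double y = System.Math.Sin(dLon) * System.Math.Cos(phi2);
        double x = System.Math.Cos(phi1) * System.Math.Sin(phi2)
                 - System.Math.Sin(phi1) * System.Math.Cos(phi2) * System.Math.Cos(dLon);
        return (System.Math.Atan2(y, x) * Mathf.Rad2Deg + 360.0) % 360.0;
    }

    // Returns the POI whose bearing is closest to the user's head
    // heading, or null if nothing is within the tolerance.
    public static PointOfInterest? FindFacing(
        double userLat, double userLon, float headHeadingDegrees,
        PointOfInterest[] pois, float toleranceDegrees = 10f)
    {
        PointOfInterest? best = null;
        double bestDelta = toleranceDegrees;
        foreach (var poi in pois)
        {
            double bearing = BearingTo(userLat, userLon, poi.Latitude, poi.Longitude);
            double delta = System.Math.Abs(Mathf.DeltaAngle(headHeadingDegrees, (float)bearing));
            if (delta <= bestDelta) { bestDelta = delta; best = poi; }
        }
        return best;
    }
}
```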

So we have gestures as well on our devices. The frames specifically have an input gesture here on the right side. It's a double tap, and it's not a dedicated touch-sensitive sensor-- it's actually based on the accelerometer, so it's a little impulse tap. And the tap actually works from any side, but it's recommended to do it from the right side because that's where the sensor kit is.

We also have custom gestures that we actually encourage our developers to explore, and they're a mixture of the other sensors. So a custom gesture would be something like: you're creating an experience, and it tells you to look up, you look up, and you track that rotation change. That would be a custom gesture. In some of the games that have been built, it sometimes tells you to duck, and there's a bullet flying over you or something like that. That would be a custom gesture too.
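For a sense of what detecting that look-up gesture might involve, here's a minimal sketch-- purely illustrative, not SDK code-- assuming something else (like the RotationMatcher component we'll use later in this workshop) is already writing the wearable's orientation into a transform:

```csharp
using UnityEngine;

// Illustrative custom-gesture detector: fires once when the user's
// head pitches up past a threshold. Assumes another component
// (e.g. the SDK's RotationMatcher) is applying the wearable's
// orientation to this transform.
public class LookUpGesture : MonoBehaviour
{
    [SerializeField] private float _pitchThresholdDegrees = 30f;
    private bool _fired;

    private void Update()
    {
        // Looking up is a negative rotation around the local X axis in
        // Unity's convention; DeltaAngle keeps the value in [-180, 180].
        float pitch = Mathf.DeltaAngle(0f, transform.eulerAngles.x);
        bool lookingUp = pitch < -_pitchThresholdDegrees;

        if (lookingUp && !_fired)
        {
            _fired = true;
            Debug.Log("Look-up gesture detected");
        }
        else if (!lookingUp)
        {
            _fired = false;
        }
    }
}
```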

So Heads Up, Hands Free is one of the design principles that we have. The idea behind Heads Up, Hands Free is that in a lot of the applications today, whether they're AR or just standard mobile applications, you're always staring at your phone. You're always walking around with your neck a little craned because you're always looking down at your phone.

The whole idea behind Heads Up, Hands Free is that we want to create applications where you don't have to stare at your phone. The input is the wearable, and the output also goes through the wearable. So you don't actually have to look at a screen for most of these activities.

So I'll give you an example-- let's say NaviGuide. What it does is it knows your GPS location based on the mobile device, and it knows your heading based on where you're looking, via the magnetometer. So what it can do is tell that you're looking at this restaurant, for example, or this cafe. You double tap the input gesture, and it'll actually pull up a Yelp review.

Now, think about the non-Heads-Up-Hands-Free way of pulling up a Yelp review. What are you doing? You're most likely Googling the name, or you're typing in the street address. But it requires you to pull out your phone, launch the app, find the name and type it in-- or the address, or the pin on the map-- and then you get to that Yelp review.

It takes you out of what you're doing. So if you want to look up five or six restaurants walking down the street, you're basically probably staring at your phone for like 15 minutes. The idea behind Heads Up Hands Free is you just walk by a place, tap, you get the information, you move on-- no phone required, at least not looking at it.

Now, it's a really cool concept, I think, and a bunch of our apps use it as their guiding principle. It's not a must, but it's a design philosophy, and some of the developers have been really clever about implementing it. The way that it works is the sensor data gets sent to the mobile app, so the wearable has to work with a mobile phone, iOS or Android, and that goes for the frames and the headphones as well.

So some of the products that work with it are the frames, for example, which you all have here. I'm going to show just a brief little intro to the frames.

[VIDEO PLAYBACK]

[MUSIC PLAYING]

 

(SINGING) I look good. Good, good. Say, I look good. Good, good. This is sunshine dripping down my face.

[END VIDEO PLAYBACK]

So the frames were actually a concept that came out of our San Francisco office, so it's a very different kind of idea behind it. I, personally, think it's a super cool idea. I think frames are a really great outdoor wearable.

It's great while you're biking, for example, when you don't want to be interrupted. If you don't want to-- let's say you get a call. You could just pick up straight through the frames. You don't have to pull out your phone.

Or you could do turn-by-turn navigation. Again, you don't have to be mounting your phone, putting a waterproof case or whatever. You just have your frames on.

Also, it doesn't plug up your ears. It's open-ear audio, so you can still hear the sounds of the street around you. And it's not unsafe in that way, like wearing noise canceling headphones while biking, for example.

I just realized I got to do this QuickTime thing. Maybe click there. There we go.

Next up-- oh, skip that-- the QC35 IIs. Now, this product you might be more familiar with. You've probably seen people wearing these, usually maybe on an airplane. So this is the classic noise canceling headphone, which also has Bose AR sensors in it. And a lot of people actually don't know that-- I believe all the models manufactured after October of 2018 have the Bose AR sensors in them.

And then the NC700s, which are these-- so these are the brand new noise canceling headphones that just came out. I get asked what the difference is between the QCs and the NCs. There are a few.

Obviously, there's a redesign. It looks a lot more modern. It has a little bit more of a sleek aesthetic. I think the QC35s have a very dad-road-trip look to them, especially the gray ones.

But yeah, it's a brand new redesign, and one of the main upgrades, I'd say, is the microphone. So for conference calling, it actually has a noise canceling microphone because it has eight beamforming mics. So it can actually pick up your voice as opposed to the environment, so I found it really good in loud environments. They've actually done comparison tests in a loud environment, and you can barely hear anything in the background while you're talking with them. So I personally really like that because I do a lot of conference calls in sometimes noisy environments. There's a little promo video associated with that, too, if you want to see.

[VIDEO PLAYBACK]

And these are the first product that was-- the first Bose product that has been marketed with the Bose AR features in it. So the whole idea behind the NCs is, again, Heads Up, Hands Free.

[MUSIC PLAYING]

 

FILIP BABA: So I don't know if you saw near the end of the video, but there's a little capacitive touch screen here on the side. So it's not touch screen, it's a touch--

AUDIENCE: Surface.

FILIP BABA: Surface, yeah. And it's, yeah, you can do volume up and down, skip track, or go back. And then there's also touch and hold, which you can actually get as a gesture inside Bose AR when you're building your app. So again, the whole idea is, you're not pulling out your phone. You have your wearable, your AR-enabled wearable, which is your headphones, which you carry around with you. If you're like me, I carry around my headphones everywhere. So you can talk to your Google Assistant or to Siri through it, receive messages. It'll actually read out the messages to you.

So again, the whole idea is to have this wearable be pretty much your main input and output device for your daily mobile uses. So one of our projections is that we're going to have one million AR devices in 2019 across all three of the products that I just showed you. Now, that's actually really huge, because most AR is single purpose AR.

If you get a HoloLens, it's more enterprise. You're not really going to be wearing it out on the bus, you know. Or a Magic Leap, same thing-- they only work in certain environments. It can't be used in bright sunlight. AR, typically, is a very niche thing that's found in specific places for specific use cases.

But the idea behind Bose AR is that these wearables are actually being worn by people every day, already, just for their regular life, to listen to music, to do their conference calls. But now we have this AR platform built into it as well. And developers can actually leverage a very big install base. Because there's a lot of these devices out there. And they're constantly being sold and the user base is growing.

And they're going to be looking for Bose AR enabled apps. Well, what can I run on these new headphones that I just bought? So we have an iOS and Android native SDK. And then we also have a Unity plug-in, which is what we're going to be using today, which actually works with both iOS and Android.

So I'm just going to go into a few of the use cases, a few of the apps that are out there, that might be a little bit inspiring so you can get an idea of what people are doing. So Otocast is a really cool concept for how you can-- let's say you're at a music festival. They could set up a tour where you actually hear these audio pings.

So if there's a point of interest as you're walking by, you'll just hear, like, a spatial audio pinging. And it's very unobtrusive, it's not like it's yelling at you. Hey, there's the thing here.

It's just kind of like, it's a ping. And then if you look at that ping, you can actually double tap to enter it. And it'll tell you, oh, this is this stage. Or maybe if you're doing like a historical tour, it'll tell you this is this landmark, would you like to enter this tour? And then you enter it. And then you can go and actually explore and it'll kind of tell you what these environments are.

Walc is kind of a similar concept, also to Otocast and Naviguide. And Naviguide, I explained earlier. It's kind of like you're using a gesture to Yelp things, basically. So it's really cool to kind of, like, not bring you out of your travel experience. You're just kind of walking, and exploring, and getting information you need without, again, staring at your phone.

So all of these use the sensors in the way I said initially. You take the GPS from the mobile device, and then you use the magnetic heading from the wearable. And what's really cool about using magnetic heading from the frames is that people always wear their sunglasses facing straight ahead. It's not like your phone, where the magnetometer sometimes-- you ever use the map and you're trying to look at the compass on the map, and it's pointing the opposite way, or it's not very accurate? And you've got to wave it around to recalibrate your phone. With the frames, it's actually a lot better, because you're never really pointing them in a wrong direction. They're always facing where you're looking. So they're really good for getting a heading.

Sport and wellness-- another use case is some of these sport apps. Golfshot's actually a really great example. People wear frames when they're out on the golf course. They're sunglasses, and they might be listening to music as well, so it's a perfect fit. So Golfshot's an already established app that golfers use to get information about their shot and their stored game details. I'm not much of a golfer, so I don't really know all the details. But I do know that they have a Bose AR enabled feature that actually allows you to measure things like distance to the next hole, again without having to pull out your phone and type in some information. It uses all the data from the sensors and from the mobile device. It knows where you are, it knows what course you're on. And then you can have this seamless experience, again, without pulling out your phone. You're already wearing some glasses because you're playing golf, and it's sunny. I hope that makes sense.

So New Balance, I think, is a really cool one. This is a demo. They created more of a workout app-- it's kind of a motivational workout app that counts your sets and reps. So think: you're doing sit-ups, you're actually performing that custom gesture, and it's counting how many sit-ups you did. And it gives you positive feedback-- oh, you're doing great, two more, or something like that. So again, if you weren't doing Heads Up, Hands Free, maybe you'd be checking your phone to see how many you've done. There's really no way to do it seamlessly unless you're wearing an extra set of sensors on you.

Headspace is also a demo. And this is actually getting a big revamp. But the idea behind this-- it's a meditation app, so it's a perfect fit for things like spatial audio. They even have a little exercise where you stretch your neck.

And then again, it's using the sensors in the wearable to actually be able to tell if you did your neck stretch. Then when you do your neck stretch, it kind of has this soothing voice. It's like, all right, so now, we're going to close our eyes and, you know. And it starts to play a little soundscape. And it helps you kind of meditate.

Another idea I heard brought up is, you can actually check someone's posture. So if someone's falling asleep, you can kind of slightly ping them to wake up, so things like that. Like I said, developers have been really creative with it.

And then music, of course-- I think music is one of the best use cases. Spatial audio, which is what today's workshop is mainly going to be about, is a very unexplored area of music, or sound in general. And Bose AR is a perfect fit for spatial audio, especially because of the sensors I was just mentioning. Because you know the head orientation, you can actually simulate spatial audio just between the two stereo speakers. You can blend the sound between them based on where your head is looking.
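As a toy illustration of that blending idea-- this is not how Bose's spatializer actually works; real spatial audio uses HRTFs-- you could drive Unity's built-in stereo pan from the angle between the head's forward direction and the sound source:

```csharp
using UnityEngine;

// Toy head-tracked panner: pans a 2D AudioSource left/right based on
// where the source sits relative to the listener's head orientation.
// Only an illustration of blending between two stereo speakers.
public class HeadTrackedPanner : MonoBehaviour
{
    [SerializeField] private Transform _head;        // rotated by the wearable's gyro
    [SerializeField] private AudioSource _source;    // the sound to pan

    private void Update()
    {
        // Direction from the head to the source, in the head's local frame.
        Vector3 local = _head.InverseTransformPoint(_source.transform.position);

        // Azimuth: 0 = straight ahead, +90 degrees = hard right, -90 = hard left.
        float azimuth = Mathf.Atan2(local.x, local.z);

        // Map azimuth to Unity's stereo pan in [-1, 1].
        _source.panStereo = Mathf.Clamp(Mathf.Sin(azimuth), -1f, 1f);
    }
}
```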

So Radar is a great example of this. So Radar is an app built by Bose, which actually connects to something that's called Creator Tool that's currently in development. It's in closed beta.

But the idea here is that Radar is going to be kind of almost like a marketplace slash area where you can browse lots of different spatial audio experiences. And they're going to be tied to certain areas on the map. So maybe if you're in a certain place, there's certain spatial audio experiences that wouldn't be in some other place.

And it can do things like it can ping you when you enter, let's say, some park. And maybe there's a spatial audio experience tied to that park. And you'll be able to enter it if you just kind of have the app running in the background.

Traverse is a cool example of a mixture between our API and an existing AR API. I think they're using ARKit. And what it actually allows you to do-- it's kind of like you're in the middle of a concert, and you're hearing the different instruments playing around you.

So what you can actually do is, while you're holding your phone up, you can close your eyes and just walk around the room. And you're actually exploring all the different instruments around you. You can walk up to the singer, you can walk up to the piano. And it's using ARKit to do positional tracking and Bose AR for spatial audio and your head orientation.

So it's actually using both the phone's positional and the head's rotational tracking. And it knows between the two. And it kind of simulates as if you're actually in the middle of a band or a concert that's playing.

iHeartRadio is a cool demo app, I'd say. It's basically another take on how you can tune in to different radio channels. So instead of pulling out your phone and seeking, you're actually using your head orientation. And you're kind of, like, your head's the tuner. And then when you like something you just kind of double tap and you go into it.

And then you can pull out of it at any time and tune into something else. And if you get a little preview and you like it, you tap into it and you're listening to it without having to look at oh, I'm 22.3 FM, or whatever. You're just kind of listening to it.

Games are predominantly in the realm of our Unity SDK. So we have some cool games that have been built. The three that I have up here are what I would call representative experiences. We actually have a lot more games that have been built, especially this year.

And I'll show you actually some videos of the games to kind of get inspired. So all these games are actually out. So you can download them on the App Store if you have a Bose AR enabled wearable.

And like I said, they're representative experiences, meaning that these developers created some design patterns out of them. They were among the first to make games for it. And I think these three are each kind of in their own category.

So OverHerd, it's a really funny game. You basically play as an Englishman that's taunted by Frenchmen at castles at night. It's very kind of, like, Monty Python-esque. Funny accents, the voice acting is hilarious.

And what you do is you use your head as, like, a catapult-- you aim with your head orientation, like this. And you double tap, and you basically launch cows into these castles. And you hear the moo, and it's all spatial, so you're listening for where they're at.

And when you're turning your head, you hear the castles over there. And then the Frenchmen are, like, taunting you. Like, oh, I bet you can't hit the broad side of a barn, or whatever. I'm not going to try to do the accent, but you should hear it. It's pretty hilarious. And it's a great example of, it's kind of using the spatial audio and using your head orientation as the controller, basically.

So Komrad is a cool design pattern. So it's basically like a spy novel. And it's a choose your own adventure spy novel. So you can actually make decisions in the story.

So the story goes on and there's yes and no decision trees that you go through. And at some point I think you have to duck and dodge a bullet. And it's got this whole AI theme to it. And the AI's talking to you and you're making decisions. And you kind of change the outcome of the story.

So that's a pretty popular design pattern, and one that I'm a huge fan of. I love choose your own adventure games. And I think the medium is perfect for it. Instead of just doing yes or no on the screen, you can actually do yes and no by nodding and shaking your head. So it's very immersive. When you get asked a question you're like, yes. And then it kind of goes on.

Dead Drop Desperado is a multiplayer game-- a local multiplayer game. What I actually really like about this one is it's got a little bit of a party-game aesthetic to it. So you play with a friend: one of you holds the mobile phone, and the other one gets handed the wearable, let's say the headphones.

And what happens is he's standing in front of you. And you shoot bullets at your friend. And your friend is trying to dodge the bullets by listening to where they're coming from using the spatial audio. And you're kind of leaning left and right like that.

And then it gets switched off. And then you're the one who's dodging, and he's the one that's shooting. And then in the end, there's kind of a score tally and it's got a leaderboard. And it kind of has a little bit of a competitive aspect to it.

What I personally like is that you could, let's say, you have your headphones, and you want to show someone something cool on them. You could just kind of hand off the headphones. And now it's this multi-player interaction.

So you have the headphones, I have the phone. And you switch it up. And it's also a little bit competitive with the leaderboard and such.

So like I said, representative experiences. You're definitely not limited to these design patterns. We're actually excited to see what our developers come up with. And there's always cool stuff that people come up with.

And I just wanted to show you a little bit. So we have this video that was produced. We've hosted a bunch of game jams this year, and this is one of the first game jams that we hosted. And these games that I just showed you were actually built at it.

So I'll just go ahead and show you. So this is from our Playcrafting partnership. Playcrafting is a small game-developer collective, and they also host events like Play NYC.

And they bring developers, they teach Unity classes, and they just have a general kind of game developer community across the country. So we've partnered with them to have some of these Bose AR game jams. And I'll show you the video.

[MUSIC PLAYING]

 

ERIC HAMEL: We're here this weekend at Bose's offices in Boston. And we're doing a game jam with their new AR technology.

MICHAEL LUDDEN: We've given the developers a brand new version of the Bose AR SDK and Unity plugin. It allows Unity developers to use and add Bose AR integration to Unity based apps.

REJON TAYLOR-FOSTER: They've built a very intuitive system for developers like me to be able to just pop it into Unity and just go.

MICHAEL CARRIER: We give you access to all of the sensors that are available in the device-- the accelerometer, the gyroscope, two different rotation sensors. They provide the orientation of the user's head in world space. We can actually provide spatialized audio, which just gives you an entirely new level of context when you're making any sort of experience.

ERIC CHAN: We're able to get all sorts of different types of information from the frames glasses, which we can use to help develop the new types of interactions.

ERIC HAMEL: We were challenged to create an augmented reality experience that put audio first.

ANNA SHABAYEV: It's very interesting to look at games from, like, a whole different angle.

JONATHAN SHRACK: Usually people think AR and think, oh, it's all visual. And this one, it's Bose, so it's audio.

CRAIG HERNDON: The challenge really made us put our heads together and be like, what can you really do?

ANNA SHABAYEV: There's a whole world of sound. And we shouldn't just be paying attention to just the visual component.

MICHAEL LUDDEN: So we've had a lot of really creative ideas that the developers have come up with so far. And I think some of the more interesting aspects are the different game mechanics with the user interface experiments that they're doing.

MIKE LEVINE: The game we created, we're calling Sonic Samurai. You're the samurai, and you can't see. But there are all these monsters around you. And you basically have to use the Bose headset to determine where the monsters are. And then you actually use your mobile device as your sword to strike at the monsters.

REJON TAYLOR-FOSTER: It's a game where you essentially try to dodge bullets that are sent from one user towards you. Imagine The Matrix, but you can't see the bullets that are coming at you. You can only hear them.

MARC HARPIN: Our game is called OverHerd.

ERIC HAMEL: You play a medieval catapult operator, catapulting poultry and cows at French castles.

MARC HARPIN: You're doing this all at night, under cover of darkness.

ERIC HAMEL: So you have to use your ears to hear where the castle is.

MARC HARPIN: When you point your frames in the correct direction, you can hear the French defenders taunting you.

ERIC HAMEL: Daring you to throw things at them.

FRENCHMAN 1: Your efforts are all wastes of perfectly good livestocks.

MARC HARPIN: By turning your head and elevating it up, you actually modulate the angle the catapult will fire at.

ERIC CHAN: You are being bombarded by vegetables in a 360 degree radius. You can tell what direction the vegetables are coming from based on spatial audio. And you need to find and face the correct vegetable and eat it.

JONATHAN SHRACK: We're working on kind of a spy thriller game.

CRAIG HERNDON: When you put on the Bose frames, you start to get radio chatter from a secret agent. And you start following instructions to locate radar pings, defuse bombs. Like James Bond and Mission Impossible in real life.

WOMAN 1: Duck now.

ERIC CHAN: Augmented reality is really one of the most important future mediums for design.

MICHAEL CARRIER: And we're getting the tools set up in a way that allow people to start experimenting. And using something like Unity allows that iteration to happen very, very quickly.

REJON TAYLOR-FOSTER: It's an honor to be chosen to be here to work on this technology for the first time.

ANNA SHABAYEV: And also give developers a bigger idea of, like, what's possible with this stuff so that they can use our games as a stepping stone for what's next.

DAN BUTCHKO: Here we are at PAX East 2019. All five of these games are debuting for the first time anywhere.

ERIC HAMEL: People are having a lot of fun here this weekend. Lots of people dodging vegetables and throwing cows.

MAN 1: I always enjoy, like, finding new games that are innovative.

MAN 2: I've never played anything just based entirely on sound.

WOMAN 2: I don't think I've seen anything else here that relies completely on audio and motion, no visual at all. But it felt really immersive and cool.

IAN CUBIN: People are really getting a sense that the audio is real and spatial.

WOMAN 3: You felt like you're really there.

MAN 3: It's just a whole different experience.

WOMAN 4: It's more social. It really brings us back to where games really are, where it's with other people.

MAN 3: It's a fun ride, these glasses.

DAN BUTCHKO: We're changing the landscape of AR and games in America. And this is just a taste of what's to come.

CRAIG HERNDON: It's very exciting for us to have that chance to impact other people's lives with a new technology.

CHUCK FREEDMAN: The things that people were able to do over a weekend really inspired me to think we're going to see some amazing stuff this year.

FILIP BABA: And we have actually seen some pretty amazing stuff this year. The more of these experiences get built, and the more developers get creative with it-- it's an early platform, so there's plenty of room to set the standard for what can be created. What else do I have here?

So I have some actual gameplay footage that I'd like to show you of some experiences that I personally like. Developers cut up these videos to show what the games are like. So I'm just going to get into these. And again, just some more inspiration.

NARRATOR 1: The Worst Grim Reaper's Soulmates is an AR music game.

FILIP BABA: That's this t-shirt.

NARRATOR 1: In which a singing grim reaper plays a song that responds to you and your world. The grim reaper Sebastian is a character from another one of our games. He's tired of his dead end job collecting souls and wants to pursue his dream of being a songwriter. Seeking inspiration, Sebastian asks you to let him write a song about your life.

[MUSIC PLAYING]

 

SEBASTIAN: (SINGING) You looking up, maybe lost in thought, kind of like me, you know I daydream a lot. There are things on the ceiling and in outer space. There's magic in almost every place. Hey, would you like to share your thoughts with me? Wanting to learn about the inside of the human skull. I ask what's going on in your soul. OK, mm, are you maybe thinking about food? I still have a lot to learn about humankind.

AUDIENCE: [LAUGHTER]

SEBASTIAN: Would be on their minds. OK, so if it's not spaghetti or chocolate cake. So maybe the fact that the sun will someday explode and everything we know will go up in an ocean of flames? The human mind is less dark than I would have guessed. I'll try to see inside of you once more. Hm, oh, is it someone you like? As in, like like?

NARRATOR 1: Imagine players using this app while commuting and exploring or just relaxing. Through interacting with Sebastian and his song, we want to show players that even the smallest things in life can carry meaning. Sometimes we just need someone to talk to.

FILIP BABA: Cool, right? It actually works, and it's actually really fun. So one of the things that I personally have learned from doing some of these jams and seeing what developers build is that when you're not focusing so much on the visuals, you have a lot of room for story and substance, if I may say so myself.

Take this developer-- we had teams of four, and three of those people were just content creators. One was Mitty, who was actually producing music during the jam. One was just doing the singing and the voice acting.

I forget what the third person was doing-- I think it was UI. I do recommend having nice, polished UI, but it doesn't have to be crazy, that's the thing. Most games are very 3D.

And one of the things that's a problem-- I'm a game developer myself-- is that nowadays there are so many tools and so many things you can do that it's kind of hard to figure out what you want to do. You're just overwhelmed with all the things that our modern devices can do.

So there's a lot of room for this kind of content to be produced. So this is kind of like a little AI kind of companion game, you could say. And yeah, it responds. And there's other things he didn't mention in the video. Like, it actually changes based on the weather.

So it actually checks the weather. If it's a rainy day, you start the game, you'll have a rainy day mood. If it's a sunny day, you'll have a sunny day mood. So it's kind of a replayable thing. And I don't expect people to be running this app, like, nonstop.

But I could probably see people running this for that one hour of their commute during the day, or maybe for their lunch break or something like that-- which is actually a lot in terms of mobile app usage. So it's fun. And it's music, so you're probably already listening to music anyway. Why not have a soundtrack to your life? That's kind of the idea behind the game.

This one is also a really cool example. So I'll just let him explain it.

NARRATOR 2: Hip Hop Hero is about bopping and grooving your head to the music. Similar to other rhythm games, you score points by staying on the beat. At the moment, we have about a dozen custom written songs. In the future, we want the possibility for players to download hundreds of songs or play with their own music library.

[MUSIC PLAYING]

 

We are going to play Party in the Washing Machine. Our foxy friend here is grooving along as well. You can follow him, and he helps you stay on the beat. The line below is showing your head movements. It also changes colors to indicate how well you're doing.

So we really intend this game to be a phone-in-pocket experience. We added both audio and tactile feedback. For example, a high-pass filter turns on when you're doing badly, and your phone vibrates with the beat when you're doing great.

So the idea of Hip Hop Hero came from us wanting to make a game that plays naturally with Bose AR's sensors-- something you can play while commuting on the train in public.

FILIP BABA: Yeah, it's like a play on Guitar Hero, kind of. You're staying on the beat, and you get rated based on how well you did-- how well you stayed on the beat.

And you don't have to do a specific head gesture. It's just how you would normally bob your head, and they have a pretty good algorithm for detecting that change. They're using both the accelerometer and the gyro, but I'm not 100% sure on that.
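For what it's worth, a very naive head-bob detector-- emphatically not their algorithm, just one simple way to do it from a gyro-driven transform-- might count reversals in pitch velocity:

```csharp
using UnityEngine;

// Naive head-bob counter: counts a bob each time the head's pitch
// velocity sharply flips from moving down to moving up. Purely an
// illustration; not Hip Hop Hero's actual detection algorithm.
public class HeadBobCounter : MonoBehaviour
{
    public int Bobs { get; private set; }

    private float _lastPitch;
    private float _lastVelocity;

    private void Update()
    {
        if (Time.deltaTime <= 0f) return;

        // Positive pitch = head tilting down in Unity's convention.
        float pitch = Mathf.DeltaAngle(0f, transform.eulerAngles.x);
        float velocity = (pitch - _lastPitch) / Time.deltaTime;

        // A fast down-to-up reversal counts as one bob.
        if (_lastVelocity > 20f && velocity < -20f)
        {
            Bobs++;
        }

        _lastPitch = pitch;
        _lastVelocity = velocity;
    }
}
```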

But yeah, I thought it was great how they visualized it as well. That's an actual graph based on your head movement. And then you get rated. You've probably seen games like this.

And this is the last one. I like this one because you're kind of producing music, in a way. Now, it's basically song loops that they've created, but there are a lot of song loops in this app.

Someone here can probably do the math on how many combinations of unique songs you can get out of it. It's a lot when you include, say, 10 loops just for bass lines, as you're adding complexity to a song. If you have maybe 10 of each sample type, and in the end around 40 samples in total, that's four layers of 10 choices each-- on the order of 10 × 10 × 10 × 10 = 10,000 combinations-- so you could be creating unique music on every playthrough.

NARRATOR 3: Choose your own adventure musical experience of layering song loops, set in a world where flowers produce music. Players listen, explore, and nod along to the music, collecting song loops like pollen, in order to blossom a musical garden.

The game is played by looking around to find new sounds, and nodding your head along to the beat when you find one you like. This process can go on indefinitely, allowing players to explore countless new ways of layering together song tracks.

FILIP BABA: She totally nailed the explanation at the beginning. But basically, yeah, you look around, you hear a preview of a loop, and if you like it, you nod to it. And then you go on, and you're just layering loops. And it works really well. The developer really figured out a way to make it work.

If you've ever tried to make music, sometimes you could just mess up and it just sounds terrible because you're mixing way too many dissonant sounds. But they managed to figure out an algorithm to layer it and it always sounds good.

And I've tried to break it. Whenever I play any of these experiences, I'm pressing everywhere and nodding to everything, trying to make it sound bad. And I couldn't. It always sounded good. So super cool.

So that's our developer portal. It's at Developer.Bose.com/BoseAR. That's where you'll find the SDK. I'm sure most of you have already gone on there and downloaded it, but I'll show you where all that is in a sec.

And the last, but not least important, thing that I wanted to mention: we have a new program that we just launched recently, and that's Bose AR Certification. What Bose AR Certification is, is say you build a Bose AR app and you submit it for certification.

Our internal team will actually help you get across the finish line and publish the app to the App Store or the Google Play store, making sure you have all the right requirements. And then if it does get certified, and it's within our brand guidelines-- which, again, we'll help you get through-- you can possibly get featured in the Bose Music app or the Bose Connect app. And you might even get featured on the Bose website.

So it's a really great showcase and a really great way to get noticed, especially as an early game developer. It's an early platform, so obviously we're looking for great content to feature on our portals-- as long as the content's great and, again, passes the certification.

All these games that I've shown you are in the process of getting certified, and they'll all be available-- when you have your frames, for example, and you hook them up in the Bose Connect app, there's actually a section there for Bose AR enabled apps. You click that, and there's a directory of apps. Yes?

AUDIENCE: So this works with Android now?

FILIP BABA: So it works on Android as well, yes, it does.

AUDIENCE: OK.

FILIP BABA: We had a little bit of an earlier start with the iOS SDK, so a lot of stuff was mainly iOS.

AUDIENCE: In July it wasn't working on Android.

FILIP BABA: Yeah, now we're rectifying that. There are going to be a lot of Android experiences coming out. Sadly, some developers still just build for iOS, and we kind of have to prod them-- like, hey, could you also do an Android build? Sometimes they need to be incentivized to do that. Other times they just do it.

If it's Unity, then most of these experiences are going to be on both. But sadly, that's just kind of how native app development is-- developers sometimes want to work in one and not the other. But we are fully Android supported now as well.

And then we also have a Get Inspired page. This is a place where we have some case studies written. Golfshot's there, if you want to read about it. Dead Drop Desperado's also up there. I think OverHerd.

And it's just kind of a little interview with the developers, where they talk about their development process. It's a good place to get inspired. Yeah, that's basically the intro. I hope that didn't take too long, but I just wanted to give you a little bit of an understanding of what this is. So does anyone have any questions, or no? Good.

All right, so now that we've gone through that, what I'm going to do here is close my Unity. And let's go ahead and actually fire up Unity, and we're going to get our little beginner workshop started.

So you should all have a copy of the Unity package, the Bose AR SDK for Unity, which you can find on our downloads page. Let me know if you have trouble finding that.

We're going to be using this 4.0.1 beta. And when you download that, it's just a Unity package that gets downloaded. So I'm just going to let everyone kind of get to that. And the version of Unity that I'm going to be using is just the latest one. I have 2019.2.5F1.

And I have Android and iOS build support as modules in it. And I recommend that if you're going to be building to mobile that you have your desired platform modules installed. I'm just going to let everyone go ahead and do that.

AUDIENCE: Hey, should we tell them that-- is it still that iPhone is only VDM?

FILIP BABA: It's always a thing, yeah, it's just Apple and iPhone. But again, if you don't get a chance to build, you'll still be able to preview the app and play with it directly from your laptop. For most of this workshop, we're actually going to be sticking to the creative side of things, and concept, and trying to build an experience.

And then, like I said, towards the end, we'll try to make sure that everyone has built. And if you don't have the right device, I'm sure we have some extra devices. And if it comes to it, we could potentially also build or maybe share devices if needed. But once you have it set up once, then you're good to go. I have my iPhone here. I usually have to have both.

All right, everyone got their SDK? And ready to go. So what we're going to do is we're going to actually make a new project in Unity. And I'll just make sure everyone has this before we do that.

I'm just going to do the latest. And I'm just going to call this project MIT Bose AR Unity Workshop. And I'm just going to pick 3D project and create. Bless you. Hm?

AUDIENCE: Isn't it stable core?

FILIP BABA: Yes it is. I think I just missed a little bit in the beginning. All right, so I'm just going to switch to the default layout here.

Anyone not have their new Unity project open yet? All right, just take your time.

AUDIENCE: I'm downloading [INAUDIBLE].

FILIP BABA: Oh, well good thing the internet's fast.

AUDIENCE: I'm using an earlier version of Unity right now.

FILIP BABA: How old? What's the version?

AUDIENCE: 15.

FILIP BABA: That's fine.

AUDIENCE: OK.

FILIP BABA: That's fine.

AUDIENCE: I'm downloading the new one.

FILIP BABA: Anything that's past the LTS, the 2017.1.4, should be fine. Anything with, like, an f at the end of it-- final, not beta-- should work.

AUDIENCE: OK.

FILIP BABA: If for whatever reason it doesn't, I know sometimes there's been some edge cases. But it should be fine.

[INTERPOSING VOICES]

 

So when you do have this, I just want you to take that Unity package that was downloaded and just click it or double click it to open it. And you'll get an import dialog. And I just want you to import the entire thing into your project. Everyone imported?

AUDIENCE: Yeah, what do you click to import again?

FILIP BABA: You should just be able to double click the Unity package. And it should just pull up Unity. If for whatever reason that doesn't work, you can always right click in your project here and then Import Package, Custom Package. That also works as a backup.

 

AUDIENCE: Yours is a CSC file, right? I have an MCS file. So--

FILIP BABA: For what?

AUDIENCE: Just for the assets.

FILIP BABA: Oh, yeah. Don't worry about that. Everyone imported, ready to go? Give it another minute.

So yeah, when you import the Unity package, you get the entirety of the SDK. There's samples in here, sample scenes. There's a lot of cool stuff in here that I'll get into. And the setup itself is actually really straightforward and simple.

We've tried to make this as easy as possible to either put into a new project or integrate with an existing project. So if you're familiar with Unity, or if you've worked on a Unity project before, you'll find that the user flow and the way the SDK works are typical of a Unity workflow.

So once you're all imported, you'll see these three folders show up. The first one that I want you to look at is the Bose folder. Inside the Bose folder there's Wearable, and then inside Wearable there's Modules.

You go to Modules, and the first module that we want is Connection. So we're going to go to Connection and then Prefabs. So that's Bose, Wearable, Modules, Connection, Prefabs.

And I want you to pull in this first prefab that's available here. So just drag that into your scene. You should see this white rectangle pop up. Now, the first thing that I like to do when I start a Unity project, especially if I'm developing on mobile, is I like to pull my game view, which is usually up here.

You might have a different layout. I like to take the game view and make it visible, so I drag that game view over here to the right side of the project panel. That way I always have a little preview window of what my game looks like, basically.

So if you go ahead and do that, you'll see that the panel that we just pulled in actually shows up here. And I'll explain what the Wearable Connect panel does. It's basically the entire menu that shows up where you can browse all available devices. You click a device, and you connect to it.

So this handles all of the connection for you. There are some cool features on here, like auto-reconnect and such. So, for example, if during your app the Bluetooth connection turns off, or for whatever reason the battery runs out, this panel will pop up again. It'll say you've been disconnected, searching for a device-- to keep the user experience clean.

So you have the wearable connect UI panel. And the next thing that we actually need to make the wearable connect UI panel work is we need an event system. So what I want you to do is I actually want you to just create an empty game object somewhere in the scene, and call it event system.

And on that empty game object, I'm going to Add Component here in the Inspector. And I'm just going to start typing, event system. There it is. It's a built in Unity thing.

And once you have your event system, there's a button there called Add Default Input Modules. So just click that. Now, I'll explain what this does.

This actually allows you to click the UI of the connect panel. Now, the reason why we do this separate is, maybe you have an already existing Unity project. And maybe you already have your own event system that you've set up, so we don't want to mess with that.

But since we're starting a new project, we're going to create this event system and add the default input modules. Like I said, that just allows you to actually click the UI and select the panel. So that's pretty much it. The last piece that we need to actually set up, like our entire Bose AR, is we actually go in. And we're going to create another empty object.

And we're going to call this one Wearable Control. Now, this will actually stream the sensor data to and from our app. So, Wearable Control, and I'm going to Add Component.

Now, if you clear the search here, I'm just going to show you where the Bose components are. When you clear the search, there is a Bose subcategory here, and you can click Bose, Wearable.

And we have some sample scripts here that you can use. The one that we want is Wearable Control, so we're just going to click that.
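If you'd rather wire this up from code than in the editor, a rough equivalent of the steps so far might look like the sketch below. The EventSystems types are standard Unity; the `WearableControl` class and `Bose.Wearable` namespace are assumptions based on the Inspector menu we just used:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using Bose.Wearable; // assumed namespace, per the Inspector's Bose > Wearable menu

// Script equivalent of the editor steps so far: an Event System with
// the default input module (so the connect panel's UI is clickable),
// plus an object carrying the SDK's Wearable Control component.
public class SceneBootstrap : MonoBehaviour
{
    private void Awake()
    {
        // Event System + default input module for UI clicks.
        var eventSystemGo = new GameObject("Event System");
        eventSystemGo.AddComponent<EventSystem>();
        eventSystemGo.AddComponent<StandaloneInputModule>();

        // Wearable Control: streams sensor data to and from the app.
        // The App Intent Profile (next step) would still need to be
        // assigned to it, which we do in the Inspector below.
        var wearableGo = new GameObject("Wearable Control");
        wearableGo.AddComponent<WearableControl>();
    }
}
```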

Now, one other useful thing that we added semi-recently is up in the menu here: if you go to Assets, Create, there's a Bose Wearable category, and there's an App Intent Profile. So I'm going to create that App Intent Profile, and I'm going to put it in my top directory in Assets, because I was in the Prefabs folder here.

You don't have to, but it's just cleaner that way. And what you'll see here on the right are the sensors that are available to us. For this first app that we're building, we just need the gyroscope, so I'm going to check the gyroscope here.

And rotation: six DoF. And then the update interval, I'm going to make, let's say, 40 milliseconds. That's all we really need in terms of sensors.

And what this basically does is it just kind of syncs up what the app requirements are and what the app is going to be using. So back to our wearable control that we created earlier. You'll see there's an active app intent profile empty slot right here.

You can actually just click that and pull in the App Intent Profile that we just created. So again, that App Intent Profile is under Assets: Create, Bose Wearable, App Intent Profile. In the App Intent Profile, you want to select Gyroscope, RotationSixDof, and 40 milliseconds. And then in Wearable Control, just put that App Intent Profile there.

The last step to get this to actually work is, where it says Editor Default Provider, it says Debug Provider. We're actually going to switch that to USB Provider. And what that will actually allow us to do is it will actually allow us to send the data over USB bridge directly to Unity, which is great for debugging while you're developing. So that's actually what we're going to be doing first.

So your Wearable Control has a USB provider. And that's pretty much it. So your app will actually now connect, both using USB and, if you built this to Android or iOS, you'll actually be able to pair it with Bluetooth. The only problem is we're not getting any of the sensor data. So we're not actually doing anything with it yet.

So the first thing that I want to do is, let's say I just want to visualize the head orientation on the screen. How do we do that? I want you to go back to Modules, and in here there's another folder called Model Loader, and then Prefabs. So if you go to Bose, Wearable, Modules, Model Loader, Prefabs, you'll actually see we have a 3D model of each of our Bose AR enabled wearables, which is actually really cool. And then there's also a default, for when it doesn't recognize the device, or maybe for a future product or something like that.

So the Wearable Model Loader actually pulls up the appropriate model automatically-- it knows which device you're using. But right now, let's just do it manually. So pull up the one that you have. I have Rondos here, so I'm just going to take the Rondo prefab and drag it into my scene.

And if I double click it, you'll see it's very small. So I would recommend scaling it up to something like 40 by 40 by 40. So we just have our Rondos like that in the scene.

And the last thing that I want to do is on this prefab I actually want to Add Component. And in here we're going to go to Bose, Wearable, and then there's a script called Rotation Matcher. So we're going to select that, and we're going to make the update interval 40 milliseconds. OK?

So what Rotation Matcher does is what the name implies. It will match the rotation of the wearable, so it will actually take the gyroscope data. And that's actually pretty much it. So this should already work.
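In script form, that last step might look something like this-- again assuming `RotationMatcher` lives in the `Bose.Wearable` namespace, as the Add Component menu suggests:

```csharp
using UnityEngine;
using Bose.Wearable; // assumed namespace, matching the Add Component menu

// Mirrors the editor steps on the glasses model: scale the prefab up
// so it's visible, then add a RotationMatcher so the model follows
// the wearable's gyro-driven head orientation.
public class HeadModelSetup : MonoBehaviour
{
    private void Awake()
    {
        // The glasses model is tiny by default; scale it up.
        transform.localScale = new Vector3(40f, 40f, 40f);

        // RotationMatcher applies the wearable's rotation to this
        // transform. (In the Inspector we also set its update
        // interval to 40 milliseconds.)
        gameObject.AddComponent<RotationMatcher>();
    }
}
```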

So what we're going to do is we're actually going to test it out. So what I want you to do is I want you to take your frames. Hook them up with USB to your computer. If you're on Windows, you'll hear the connected sound. And I'm actually just going to Maximize on Play here so you can see what I'm doing. I'm going to press Play.

The picker is there. I have my frames I selected in the UI. It's going to connect to it. And there are my frames tracking their rotation as intended.

AUDIENCE: And this is all happening over USB, so if the cable disconnects--

FILIP BABA: Yep. It'll stop, yeah. So the idea is that this is for development. You're obviously going to be using the cable. But if you took this app and actually built it to your phone, it would work over Bluetooth.

AUDIENCE: OK.

FILIP BABA: So I want everyone to just get to this point, and then we'll move on to the next part. And what's really cool about this is we're already pretty much just a few steps away from doing spatial audio. Now that we have the head orientation, we can actually move our point of reference for the listener.

AUDIENCE: So when you plug in USB, does it automatically just detect it?

FILIP BABA: It should be plug and play, yeah. Does it work for you?

AUDIENCE: It's not connecting.

FILIP BABA: So make sure you have the Wearable Control script. So you skipped this step. So you have to have the Wearable Control script. So what you want to do is you want to create an Empty Game Object.

AUDIENCE: Oh yes, I do have it.

FILIP BABA: Yeah, wearable control. There it is. Click it. Oh, so event system--

AUDIENCE: Yeah.

FILIP BABA: Yeah. So On Press Play. Yeah, stop the app. And then Add Default Input Modules. There you go. Now you can press Play. And you should be able to click the UI at this time. So yeah, there it is.

So let's then try to update.

AUDIENCE: I mean, it might be faster just to swap it with another--

FILIP BABA: One thing that we've also been told to do is make sure people know how to update the firmware. So if for whatever reason it's just not working, chances are you just need the latest update.

So where you find that: if you just look up Bose Updater-- if you just Google it-- it's btu.bose.com. Just follow those steps. It's usually pretty fast. You download the little tool, and it's all done online. You just have the device hooked up, and it'll pick it up.

So if I try it on mine here, it looks like this. I'm going to connect my frames. And see, it sees it. There's an update ready for my product. And you just click Update Now, and it'll do it in a minute.

If yours is working, you don't have to do that. But if for whatever reason it doesn't, that's usually the problem. It has an old version of the firmware.

If you need to know how to turn them off, you actually just flip them upside down when you put them on the table, and they'll turn off after a few seconds.

Bless you.

If you get the failed connection, usually reconnecting the device works. It's just a USB debugger thing.

So I'm actually playing the sound through my Bose frames. When you connect over USB, you can actually drive the audio through the USB as well. On Mac, you'll see it pop up in your sound settings. And on Windows, it's in the bottom right, in the sound devices. You can set the Bose frames as your default audio device. So it's a great way to preview with.

Now, what I did here is I placed the sound on this sphere. Now, this sphere is just in the middle of my field of view here. But in the next exercise, I'm going to actually show you how to spatialize that sound so you can hear it.

So I'm just going to wait to make sure that everyone's got this far.

AUDIENCE: Yes.

FILIP BABA: All right, so what we're going to do here-- there are a couple of ways to do this. So typically your camera is your frame of reference for audio. So as your camera turns, if you have 3D sounds in the scene, it's going to take the frame of reference from your camera to spatialize sound.

So what we want to do here is take the camera that's staring at the glasses. Now, you don't have to do it like this-- there are a few different ways to do it-- but we'll just do it this way for simplicity's sake. We'll take the main camera, and we'll pull it into our frames.

Now, the way I have my scene set up is my frames are just at 0, 0, 0, and they're scale 40. And the camera is kind of like right behind them like that. So I'm just going to take the main camera and parent it to the frames so it's inside the frames. And if I press Play, I'll just show you what that looks like.

So now if you're turning, this is your frame of reference. So this simulates your orientation in space, basically. Right? So here's how this is going to work. So in Unity, doing spatial audio is actually really, really straightforward. So what I want you to do is-- and I already did this, so I'm just going to delete it so I can show you how I did it. It's very straightforward.

So it won't let me. What is it, command? There we go.

So what we're going to do is, I'm just going to use a sphere to represent a sound object, just for the visuals. So I'm going to create a new game object: 3D Object, Sphere. And I'm going to place it at 0, 0, 0. I'm going to Add Component on this sphere, and I'm going to add an Audio Source. OK? That's a built-in Unity thing.

Now I take this sphere, and let's say I'm going to put this one in front of the glasses, like here. OK? Maybe a little bit further. And for the audio source's audio clip, if you click here, there are a few different sound samples that we have in our SDK. You could pick Chord LP or something-- it's up to you which one you pick. This is just for the demo.

So we're going to pick a sound effect. We put it in front. And the only thing you've got to change is make it loop. So there's a little loop check mark here. So when it finishes, it'll start from the beginning again.

And then there's a dropdown here called 3D Sound Settings. I don't know if you see that. We'll get to that in a second. But the only one we want to change is right here. You see Spatial Blend? OK. So I'm actually going to drag Spatial Blend from 2D to 3D. And that's all you need to do to turn an audio source into a 3D audio source.

Now you'll notice that there are cool things here, like there's actually the range of the sound that you can tweak and stuff. So we don't have to do any of that just yet. We can just leave it as is. Just make sure it's 3D and that you have a looping sound effect. And that's all I want you to do right now.
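In script form, this whole step is just standard Unity audio-- no Bose-specific API at all. The clip is whatever sample you picked; the position and the Doppler line are optional:

```csharp
using UnityEngine;

// Creates a visible sound object: a sphere in front of the listener
// with a looping, fully 3D audio source. Standard Unity API only.
public class SoundSphere : MonoBehaviour
{
    [SerializeField] private AudioClip _clip; // assign any sample, e.g. the SDK's Chord LP

    private void Start()
    {
        var sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.transform.position = new Vector3(0f, 0f, 2f); // in front of the glasses

        var source = sphere.AddComponent<AudioSource>();
        source.clip = _clip;
        source.loop = true;          // restart when the clip finishes
        source.spatialBlend = 1f;    // 0 = 2D, 1 = fully 3D
        source.dopplerLevel = 0f;    // optional: avoids the warp effect on fast head turns
        source.Play();
    }
}
```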

So now if you press Play and you let it connect-- mine has failed connecting. If that ever happens to you-- again, it's just a known bug that we have with USB-- just reconnect. And make sure you're always saving your scene in case a crash happens. Unity tends to do that sometimes-- rarely, but it happens. So yeah, if you get the failed-connecting thing, just reconnect and then try again.

So it did it again, so I'm going to unpress play. I'm going to reconnect. It's something to do with that it remembers the previous device, and there's a little bug with USB in that. And connected.

So when I turn my head, I can hear that sphere in front of me. And when I turn this way, I hear it here. Now I hear it on the left. So in order to get that, make sure you're driving your audio through the Bose frames. On Mac, you go into the sound settings here. And on Windows, I can also show you-- in the bottom right where the volume mixer is.

If it's saying connecting to Pixel 2, go into the Bose Connect app and to the settings, and you can actually remove any previous device that was there so it doesn't always do that auto thing.

So there is a little bit of a Doppler effect-- I don't know if you can tell when you're doing that. Right? These are all things you can actually change here in the 3D sound settings. So if you just go into the 3D sound settings and turn off the Doppler, you won't get that warp effect when your head turns.

That is an interesting effect that can be played with. But again, this is just the built-in Unity spatial audio. We actually have a couple of different ways you can do spatial audio.

So we've gotten to the point where you've set up your device, you've gotten a little bit of an intro, you've seen what the platform is about. And this is basically a first example of spatializing audio. Right? So now as an exercise what I'd like you to do for the next 30 minutes or so-- or let's say 20 minutes-- go online, find some sound samples, set up your own little spatial soundscape around you. And let's see how that goes.
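If you want a head start on the exercise, here's one simple way to scatter your downloaded samples in a circle around the listener-- again, standard Unity audio only:

```csharp
using UnityEngine;

// Exercise starter: place one looping, fully 3D audio source per clip,
// evenly spaced in a circle around the origin.
public class Soundscape : MonoBehaviour
{
    [SerializeField] private AudioClip[] _clips; // your downloaded samples
    [SerializeField] private float _radius = 3f;

    private void Start()
    {
        for (int i = 0; i < _clips.Length; i++)
        {
            float angle = i * Mathf.PI * 2f / _clips.Length;
            var go = new GameObject("Sound " + _clips[i].name);
            go.transform.position = new Vector3(
                Mathf.Cos(angle) * _radius, 0f, Mathf.Sin(angle) * _radius);

            var source = go.AddComponent<AudioSource>();
            source.clip = _clips[i];
            source.loop = true;       // keep the soundscape running
            source.spatialBlend = 1f; // fully 3D
            source.Play();
        }
    }
}
```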

AUDIENCE: So just loop them?

FILIP BABA: Yeah, make something cool. You can get some beach sounds, maybe water crashing on the side, birds chirping. Or it could be music samples. Maybe you get a drum loop.

I could show you a few. freesound.org is cool. I believe it's freesound.org. Yeah. You need to make an account, but it's free, and it's all free sounds. There are also a decent amount of websites that offer loops if you're going for a more musical approach.

So yeah, go ahead and do that. And let me know if you have any questions. I'll be going around and helping you out.