Midnight Stranger by Other Body Enterprises

To try out the latest “technical proof of concept” for Midnight Stranger, click on the image below!

A Mood Bar: red on left, green on right, blue in middle

This demo is meant to give a quick feel for what a “ported” Midnight Stranger would look like, and to prove to ourselves that it can be done in HTML5. We started working on it before we launched our Kickstarter, and it remained a work in progress throughout. Our Kickstarter didn’t make its goal, but it proved there was still plenty of interest in seeing the project through (thanks again to everyone who gave us such tremendous support and encouragement). After a pause to recover from the campaign (we all fried pretty hard), work has finally started again. Even this short set of clips demonstrates how you interact with the characters on an emotional level through the Mood Bar, rather than through some technical game-like mechanic. The previous public version (v6) of the demo implemented Midnight Stranger-style mouse navigation, where the cursor changes when it’s over a hotspot you can click on (in this case, EXIT, TALK, and GO from left to right... only TALK does anything, but clicking EXIT or GO updates the status text below to show that the underlying support is in place).
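For the curious, hotspot detection of this sort can be sketched as a simple rectangle hit-test. This is our own illustration, not the demo’s actual code: the hotspot names (EXIT, TALK, GO) come from the demo, but the coordinates, types, and function names are made up:

```typescript
// Hypothetical sketch of hotspot hit-testing against a 640x480 scene.
interface Hotspot {
  name: string;
  x: number; // left edge, in scene pixels
  y: number; // top edge, in scene pixels
  w: number; // width
  h: number; // height
}

// Illustrative coordinates only; the real demo defines its own regions.
const hotspots: Hotspot[] = [
  { name: "EXIT", x: 40,  y: 400, w: 120, h: 60 },
  { name: "TALK", x: 260, y: 400, w: 120, h: 60 },
  { name: "GO",   x: 480, y: 400, w: 120, h: 60 },
];

// Return the hotspot under (px, py), or null if the point misses them all.
function hitTest(spots: Hotspot[], px: number, py: number): Hotspot | null {
  for (const s of spots) {
    if (px >= s.x && px < s.x + s.w && py >= s.y && py < s.y + s.h) {
      return s;
    }
  }
  return null;
}
```

A mouse handler would presumably call something like `hitTest` on every move and swap the cursor whenever the result is non-null.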

As of this version (v8), we have implemented our first attempt at a touchscreen interface as well! It supports drag and poke actions. The original Midnight Stranger mouse interface relied on changing the cursor to indicate when it’s over a spot that does something (a hotspot), but that doesn’t work on a touchscreen (if for no other reason than that your finger would cover the cursor as you moved). Instead, when using a touchscreen, a flag in the upper right corner indicates when your finger drags over, or pokes at, a hotspot. If you are dragging, a circle is drawn where you lift your finger, so that if you were over a hotspot you can poke that same spot to activate it. Similarly, if you poke around looking for hotspots, a circle is drawn where you poke, and if you poked a hotspot, you can poke it again to activate it. The flag in the upper right corner tells you whether you were on a hotspot. If you drag to or poke a hotspot and then drag or poke somewhere else (not on the hotspot just selected), the selection is forgotten and you start again. So the touchscreen has two “modes”: searching for hotspots, and activating one by dragging to or poking the already-selected hotspot a second time. Note: on slower devices (where I have done a lot of my testing), poking around is a better way to explore a scene, as dragging can lag badly (on the modern phones we’ve tried, there’s no lag at all).
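The two-step touch behavior described above can be sketched as a tiny state machine. Again, this is a hypothetical illustration of the behavior, not the demo’s code; all names here are our own:

```typescript
// Hypothetical sketch of the two-step touch selection logic:
// first drag/poke "arms" a hotspot, a second touch on the same one activates it,
// and touching anywhere else forgets the selection.
type TouchResult = "armed" | "activated" | "missed";

class TouchSelector {
  private armed: string | null = null; // hotspot selected by the previous drag/poke

  // Called when a drag release or poke lands; `hotspot` is the name hit, or null.
  onTouch(hotspot: string | null): TouchResult {
    if (hotspot === null) {
      this.armed = null;        // missed everything: forget any pending selection
      return "missed";
    }
    if (hotspot === this.armed) {
      this.armed = null;        // second touch on the same hotspot: activate it
      return "activated";
    }
    this.armed = hotspot;       // new hotspot: arm it and wait for a second touch
    return "armed";
  }
}
```

An "activated" result here would correspond to clicking the hotspot in the mouse interface; a "missed" result clears any pending selection, matching the "start again" behavior described above.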

We think it’s fun to explore the scenes, but in case you have a slow device or don’t share our thrill of the hunt, we have added a button in the upper left (for both mouse and touch exploration) that shows or hides where the hotspots are located. Just click or poke it to toggle between the SHOW and HIDE modes; in SHOW mode, dashed red rectangles are drawn around the hotspots. This is also new in this version (v8).

For this little demo, choosing to interact with the person in the frame gets you a video clip that introduces the interaction. You can then choose between three possible outcomes before being popped back to the beginning (hey, it’s a proof of concept, not a full demo, but at least it’s quick). It also shows some of the limitations of mid-1990s technology: the “talking head” video technique and the 640x480 graphics. The latter was so it could run on as many systems as possible at the time; the former was because a CD-ROM (DVDs hadn’t quite made it onto the scene yet) could only hold a few minutes of full-frame video, but much more than that was needed to tell an engaging story, hence the small video superimposed on the background image. It’s like the clacking coconuts in Monty Python and the Holy Grail... after a while, you kind of forget they’re there and just get sucked into the story. One of the things people often say about Midnight Stranger is that the authentic performances and engaging stories are its strength, and that these overcome the technical limitations. The more recent production Stranger Still was able to take advantage of DVD technology and does use full-frame video. Future productions will be full frame as well.

As an important note: this effort is a work in progress <insert ironic 90s animated construction sign GIF here>, so the demo it points to will change over time as we add new features. A more important note: we test this on a few desktop browsers (Windows [not IE] and Linux; we don’t have any Macs on hand) and a couple of phones and tablets, but we know it’s not going to work on many, if not most, mobile devices (phones, tablets, etc.). If this were a “done deal” we wouldn’t have been trying to get project funding to allow us to spend the bazillions of hours needed to put out a professional product. So, results may vary (wildly); trying it on a desktop system gives the greatest chance of success (and please feel free to drop us a line with the results you get on your particular mobile device!).

Video formats and encoding remain an issue we are continuing to refine so the demo works in as many environments as possible (desktop and mobile alike). For instance, an HTC Desire 501 would initially run the application but not display the video. The browser reported equal support for MP4 and WebM, and the code picked a video in the order Ogg->WebM->MP4 (taking the best-supported format in that order, with ties going to the first in the list). After swapping the order to WebM->MP4->Ogg (which is what it is set to now), MP4 was selected instead of WebM and the whole demo ran fine. In another case, an UbiSlate 3G7 would not run the demo in the browser that came with it, but it would run in Chrome or Firefox for Android on the same tablet. The punchline for the UbiSlate is that Firefox for Android claims equal support for all three video types, but it really only works with Ogg. Luckily, the order swap that fixed the HTC phone fixed this platform/browser combination as well. It won’t run in the native browser of my personal LG-C800G (Eclypse) phone (hey, a hard keyboard is very important to me; I often have to SSH into various servers for maintenance when I’m on the run), and both Firefox and Chrome say my Android OS is too old for them (I could fix it, but meh). Re-encoding the MP4 videos with the Baseline profile fixed video playback in a bunch of places, and the same was true when we re-encoded the WebM videos to use the VP8 codec rather than VP9.
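The selection rule described (best-supported format wins, ties broken by list order) can be sketched like so. The support levels mirror what HTML5’s `HTMLMediaElement.canPlayType` actually reports ("", "maybe", or "probably"), but the function itself is our own hypothetical illustration, not the demo’s code:

```typescript
// Hypothetical sketch of preference-ordered format selection.
// canPlayType-style support levels, ranked from worst to best.
type Support = "" | "maybe" | "probably";

const rank: Record<Support, number> = { "": 0, "maybe": 1, "probably": 2 };

// `order` is the preference list, e.g. ["webm", "mp4", "ogg"].
// Returns the best-supported format; earlier entries win ties because we
// only replace the current best on a strictly higher support rank.
function pickFormat(
  order: string[],
  support: Record<string, Support>
): string | null {
  let best: string | null = null;
  for (const fmt of order) {
    const r = rank[support[fmt] ?? ""];
    if (r > 0 && (best === null || r > rank[support[best] ?? ""])) {
      best = fmt;
    }
  }
  return best; // null means nothing is playable at all
}
```

Only a strictly better support level displaces the current pick, which is what makes the list order matter when a browser reports equal support for two formats.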

The touchscreen interface has worked well on the few modern Android systems we’ve tried, running either Chrome or Firefox. We finally tried it on an iPhone and were hugely disappointed: it does not support embedded video playback and pops up its own full-screen video player to show the clips. This completely destroys the intimacy of the interaction and breaks (ignores?) the HTML5 paradigm. This has been a problem for a lot of developers and is well documented. It is apparently being fixed in iOS 10, and there are some nightmarish hacks that force video to play inline by tricking iOS, but for now know that running on iPhones will be suboptimal (it did work; the video thing was just a huge problem). We have not tried it, but from what I read it should work okay on iPads (again, please let us know if you try).

Obviously, there is still much to learn...