Torrenting is the New Black?


Yesterday, Netflix released the eagerly anticipated trailer for the fifth season of its original series, “Orange is the New Black.” Unfortunately for the company, almost the entire season is already available online, thanks to an anonymous hacker group calling itself “thedarkoverlord,” which pulled unreleased episodes from the post-production company Larson Studios’ server in April. Since then, Refinery29 reports, summaries of ten of the thirteen episodes have appeared on the show’s Wikipedia page, making spoilers accessible even to people who do not wish to torrent the episodes.

Yet, despite thedarkoverlord’s demands for ransom payment and threats to other networks (pictured below), Wired writer Brian Barrett argues that the leak “wasn’t worth paying even one cent to prevent.” In Barrett’s eyes, there will always be people who illegally torrent shows, but Netflix has a high enough number of paying subscribers (98.75 million) to make up for that minority. Moreover, the number of people torrenting content has dropped drastically in recent years; whereas BitTorrent “accounted for 23 percent of daily internet traffic in North America” in 2011, “that number sat at under 5 percent” in 2016.

[Screenshot: thedarkoverlord’s ransom demands and threats to other networks]

For me, this entire situation is fascinating on multiple levels. For one thing, the leak reflects just how much TV technology has changed, even within my own lifetime; whereas once you were at the mercy of scheduled re-runs to catch up on a missed episode or re-watch a beloved one, now we have the freedom to peruse seasons of our favorite series at our leisure and, in the case of the OITNB leak, sometimes even ahead of their intended release. The Internet has allowed for a sort of instant TV gratification that would otherwise not be possible. Sure, you could go out and purchase a boxed set of a specific season of television, but you would need to wait for the season to air and then some, whereas many streaming services today release entire seasons of a show at once, as is the case with Orange is the New Black.

Yet, despite this culture of instant gratification, the number of people torrenting content illegally is dropping. This was perhaps the most interesting part of the article for me, as a poor college student who is all too familiar with determining the most diplomatic way to ask someone for their Hulu and Prime account information. I would have thought that as average people become more internet-savvy and popular shows are spread across so many different costly platforms, more people would say, “Screw it,” and pirate shows and movies from services they don’t subscribe to. This information was shocking to me, but as an aspiring television writer, I am obviously pleased. The only thing better than having a show good enough for thousands of people to pirate is having a show that’s worth paying for.


Barrett’s article can be found at the following link: https://www.wired.com/2017/05/orange-is-the-new-black-leak/

 


Award-Winning VR Video… and a Jam Band

One of my favorite jam bands (sorry) has done something super cool. Umphrey’s McGee recently teamed up with Reel FX, a digital effects and animation company that is fortunately located here in Los Angeles (#notvancouver), to produce a 360° VR concert video! Reel FX filmed the band playing their song “Puppet String” at the House of Blues in Houston last spring. And this spring, the video took home a Gold Remi at the 50th WorldFest-Houston International Film Festival, winning the top prize in the Music Video Craft Awards for New Technology.

The resulting video is a really neat experience that you should all check out. Be sure to watch it on Chrome or Firefox to be able to use the 360° function! You can drag the screen around with your mouse to see what’s going on everywhere in the room. You can even check in on the sound guy, read the setlist at the band’s feet, or check out their equipment rigs.

Reel FX filmed the band using several different mounts, each with multiple GoPro cameras attached to capture the full 360° – basically a less-USC version of the camera we saw in the Jaunt lab the other week. What’s really interesting is when the video cuts to different cameras (some sit closer to individual band members on stage, while others shoot from the front of the stage and from the ceiling above the audience). It transitions seamlessly from one shot to the next, regardless of where the viewer is currently looking. It’s downright impressive work. And the stitching is immaculate as well – I never once saw a mount, a camera, or a part of the room that looked weird in the entire 12-minute video.

This is an extremely exciting step forward for recording live performances as it allows you to experience the space the way the concert attendees did. It also gets you closer to the artist than you normally would be. Oh, and technologically, it’s just stunning.

You should also take a quick look at Reel FX’s website. They do super cool work!

On Nintendo Cartridges

One thing I really appreciate about this class is the visits to the archive to see and experience pieces of media creation and consumption technology from the past. Part of the reason I enjoy these visits is the nostalgia I have for the media I interacted with as a kid.

One of these is the ROM cartridge for Nintendo and other game consoles. Now largely obsolete, cartridges have been replaced by CDs, DVDs, and now internet-based means of transferring data. As far as I understand, the technology of a ROM cartridge is similar to that of a memory card, so in a strange way I still interact with this technology on a daily basis, now from the perspective of a media creator, whereas I did so primarily as a consumer when I was a kid.

One thing I’m sure everyone remembers about these cartridges is their propensity for not working when you turned the system on. The common fix was to remove the cartridge, blow on it to “remove the dust” or some such, reinsert it, and turn the console back on. Since this typically worked, it was considered a solution.

Here is an article about the technology of the ROM cartridge and the phenomenon of blowing on its connectors to remedy issues with a game: http://mentalfloss.com/article/12589/did-blowing-nintendo-cartridges-really-help

According to the article, blowing on the cartridge was ultimately a placebo and did not help the bad connection; simply removing and reinserting the cartridge is what solved that. In fact, the article goes on to say, blowing on the connector was actually detrimental, as the moisture opened the door to corrosion of the contacts.

Despite all this, I still have really fond memories of blowing on my Nintendo games, as it allowed for a greater sense of interactivity than provided by the gameplay. As a kid, I was fascinated by the physical nature of the electronics technology that allowed for gameplay (media delivery) and I enjoyed engaging with it physically and tangibly. It gave me a sense of purpose and pride in troubleshooting a technical problem.

To me, this is fascinating. Part of engaging with Nintendo games as a visual medium was this ritual of overcoming the limitations of their delivery technology. A single work of media can engage people in both virtual and physical realms, with different modes of thought, simultaneously. By blowing on these cartridges, people were engaging with media and technology in a way I had never considered before.

Thoughts on a Cinematic Installation Show

Hey guys,

Today I attended a show called Lush Machines in SCA, an exhibit of the Cinematic and Media-based Installation class in the animation division.

In addition to being impressed by the quality of the work, I was also very intrigued by the potential of the media installation medium, as I have only rarely experienced such exhibits. In many ways, the installations interacted with the audience much like the VR we discussed in class, through immersion, interactivity, and experientiality.

However, all of the installations had a strong physical quality, blending media (audio, video projection, software constructs, and/or virtual reality) with physical interaction holistically. For me, this was very impactful, as it allowed for an information- and meaning-rich experience.

Below are photos of one of the exhibits I enjoyed.

Can AI Really Create?

So we’ve seen the trailer for the film Morgan edited by AI, which looks pretty professional.

But have you seen this?

 

Sunspring – a sci-fi short written by the AI “Benjamin.” Benjamin even wrote the theme song for the short.

AI is like VR right now: people are still trying to figure out what they can really do with it. But it seems like a sure thing that AI will eventually be able to take over a lot of human work.

But can AI really create?

Although “Benjamin” was able to analyze as many sci-fi films as possible, what it really did was still search its database, find a line, and drop it into the script either randomly or by probability. In the short, it is very obvious that the script doesn’t make any sense; it is more a combination of elements from different films, and it does not really have a heart. The same goes for the AI-edited trailer: it was all about data analysis.
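Just to make the “find a line and put it in by probability” idea concrete, here is a minimal sketch of that kind of line-recombination process. To be clear, this is only an illustration of the behavior described above, not how Benjamin actually works (Benjamin is reportedly a recurrent neural network trained on sci-fi screenplays), and the toy corpus and function names below are made up for the example.

```python
import random
from collections import defaultdict, Counter

# A minimal sketch of the behavior described above: a "writer" that pulls
# lines from a corpus of existing scripts and picks the next line by
# probability. Not Benjamin's actual architecture; just an illustration.

def build_model(scripts):
    """Count how often each line follows each other line across the corpus."""
    transitions = defaultdict(Counter)
    for script in scripts:
        for current_line, next_line in zip(script, script[1:]):
            transitions[current_line][next_line] += 1
    return transitions

def generate(transitions, start_line, length=10):
    """Chain lines together, weighting each choice by observed frequency."""
    output = [start_line]
    line = start_line
    for _ in range(length - 1):
        followers = transitions.get(line)
        if not followers:  # dead end: restart from a random known line
            line = random.choice(list(transitions))
        else:
            candidates, counts = zip(*followers.items())
            line = random.choices(candidates, weights=counts, k=1)[0]
        output.append(line)
    return output

# Hypothetical toy corpus standing in for a pile of sci-fi scripts.
corpus = [
    ["He stands in the starship.", "She looks at the window.", "I don't know what it is."],
    ["She looks at the window.", "I don't know what it is.", "We have to go."],
]
model = build_model(corpus)
print("\n".join(generate(model, "He stands in the starship.", length=5)))
```

Even a toy like this shows the point: everything in the output is stitched together from lines that already exist, which is exactly why the result can read like a collage of other films rather than something with a heart.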

Film is a very complicated medium. It is a combination of, and an evolution from, many other media. It’s visual and verbal; it’s a lot about rhythm and emotion. It is hard to reduce how a film works to data. There is no definition of what is scary or what is sad. In No Country for Old Men, the killer makes you feel threatened even when he is just sitting on a sofa. If we only analyze the visuals, the color, and the script, it isn’t the kind of scene an AI would pick up on, but it works powerfully.

AI might be good at executing instructions, but making the right creative choice for a particular situation is more of a challenge. That’s why the creative part still needs human supervision, whether it’s writing novels or cutting trailers. Without it, the outcome might not be as good.

Pre-Facebook 360-Degree Video: Disney’s CircleVision, Or How I Fell In Love With China

It’s happened on numerous occasions lately: I wake up, and the first thing I do in my groggy, half-aware state is reach for my phone. Facebook is usually one of my first stops, and starting in 2015 (http://www.popsci.com/facebook-and-oculus-are-bringing-360deg-video-to-your), a new feature made confused, early-morning me even more confused. I was looking at a live-streamed video of some event. Much to my shock, as I moved my phone in my hands, the video appeared to move. I realized that I could hold the phone in any direction, at any angle, and see what was going on at the event – I had full, 360-degree views of the stream. When I realized what I was seeing, I wondered: how is this possible on an iPhone? On Facebook? And streamed live? It reminded me of my first exposure to 360 video, albeit of a different kind: the CircleVision technology at the World Showcase pavilions at EPCOT, in Walt Disney World in Orlando, Florida.

World Showcase is a series of pavilions where park-goers get to experience different countries’ landmarks, food, customs, and cultures. Some of the countries represented include Canada, Japan, Mexico, Norway, and China. The Canada and China pavilions both boast CircleVision experiences, a large-format movie presentation technology used for the attractions O Canada! and Reflections of China, which both run all day in their respective pavilions at World Showcase.


Both take place in large, rounded rooms, with nine screens mounted high on the wall in a circle. Between each pair of screens is a projector aimed at a screen across the room, and because there are screens all the way around the room, CircleVision films are viewed standing up. There are railings throughout the theaters for viewers to lean against and, in some cases, hold on to (it’s easy to get dizzy spinning around to try to catch what’s happening on the screens all around you). CircleVision is Disney’s version of circular viewing, an idea that first appeared at the 1900 Paris Exposition with Cinéorama and was later followed in Russia by Krugovaya Kinopanorama, which used 11 cameras (https://en.wikipedia.org/wiki/Circle-Vision_360°#Earlier_systems).

The difference between the CircleVision experience and the 360 videos of today, in VR and on Facebook, is how the viewer interacts with the viewing environment. With CircleVision, the video content is in a fixed position, projected all the way around the wall. With VR and Facebook, the viewer controls the placement of the screen and the viewing angle of the content. The CircleVision viewer still has control over what he or she sees, but that decision consists of which fixed screen to look at; the viewer can’t move the screens or what’s on them. In both cases the viewer has agency, and thus viewing these two varieties of 360-degree video is more of a participatory act than a spectatorial experience.
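To make that difference concrete, here is a rough sketch of the one piece a phone-based 360 player adds that CircleVision never had: mapping the device’s orientation to a window onto the full panorama. This is a simplified toy in Python, not Facebook’s or any real player’s implementation (real players reproject the equirectangular frame onto a sphere rather than cropping it), and the function names and parameters are illustrative assumptions.

```python
import numpy as np

# Toy version of a phone 360 player: given the device's orientation,
# decide which slice of the full equirectangular panorama to put on screen.

def viewport_from_orientation(frame, yaw_deg, pitch_deg, fov_h=90, fov_v=60):
    """Return the portion of an equirectangular frame visible at a given orientation.

    frame: H x W x 3 array covering 360° horizontally and 180° vertically.
    yaw_deg: where the phone points horizontally (0..360).
    pitch_deg: tilt up/down (-90..90).
    """
    h, w = frame.shape[:2]
    # Convert angles to pixel coordinates in the panorama.
    cx = int((yaw_deg % 360) / 360 * w)
    cy = int((90 - pitch_deg) / 180 * h)
    half_w = int(fov_h / 360 * w / 2)
    half_h = int(fov_v / 180 * h / 2)
    # Horizontal wrap-around: looking "behind you" crosses the frame's seam.
    cols = [(cx + dx) % w for dx in range(-half_w, half_w)]
    rows = np.clip(np.arange(cy - half_h, cy + half_h), 0, h - 1)
    return frame[np.ix_(rows, cols)]

# Toy panorama: as the viewer turns the phone, a different crop is shown.
panorama = np.random.randint(0, 255, (960, 1920, 3), dtype=np.uint8)
view = viewport_from_orientation(panorama, yaw_deg=200, pitch_deg=10)
print(view.shape)  # -> (320, 480, 3) for this frame size and field of view
```

CircleVision has no equivalent of the yaw and pitch inputs here: its nine screens are fixed to the wall, and “changing your view” means physically turning your body toward a different screen.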

I vividly remember my first CircleVision experience – it floored me, and it was my first non-traditional film-viewing format. I first saw Reflections of China at a young age, and even before then I had a massive interest in China. After that first viewing I was riveted, and thus began a true love of, and fascination with, China. Being surrounded by the Great Wall and the Yangtze, and taking in the massive height of the Three Gorges Dam in the CircleVision format, all images larger than life and fully immersive in their presentation, further ignited a passion to travel to China, a dream I’ve held close to my heart ever since, and one that came true in May of 2016, when a class brought me to four cities in China over the course of two weeks. I was able to stand on the Great Wall and take in the views I’d previously seen in CircleVision, back when I hoped I’d one day get to see the real thing. The visceral impact that CircleVision had on me is proof of the magnitude of experience that 360-degree video, paired with large-format projection, can provide. While 360 video on a smartphone is cool, there is certainly merit to presentation in a theatrical environment, and while the proliferation of such technologies for the home is on the horizon, we cannot forget what theatrical viewing environments offer, something home viewing never can.

Shopping Malls Become VR Hotspots

WOW.

So after the mind-blowing conference we had earlier this week and our exploration of reality and virtual reality, I decided to fall even deeper down the rabbit hole, and I realized that virtual reality has invaded not only classrooms but also our shopping malls.

That’s right: virtual reality has been used to re-invigorate shopping malls that have seen a decline in attendance, using the empty retail space for VR experiences. According to an article from TechCrunch, The Gateway mall in Salt Lake City, Utah, was created for the 2002 Olympics and was once a bustling hub for shoppers. But with the rise of online shopping, customers are spending more time buying from home instead of going out to the malls.

Ryan Burningham, founder of The Void, made the decision to move his VR facilities to the mall for its low cost. According to The Void, its virtual reality is truly immersive, involving things that you can actually feel.


They even have a Ghostbusters game where you go around blasting ghosts with your proton stream.

And they say it’s an interactive interface, giving you real-world objects that correspond to the ones in the game for you to interact with. On Wednesday I briefly mentioned haptics, and man, does this deliver.

They even designed a specific gun for The Void, part of their Rapture series, to help users get a feel for the weapons they use in-game.

[Photo: Rapture Mark IV gun and HMD]

Users can also team up with one another and enter the same game, with up to four people per session, so family and friends can take on Ghostbusters together or search through secret tunnels. Given that we mentioned the total isolation and lack of community you get with VR, it’s interesting to see that they’re trying to incorporate a multiplayer aspect into the whole experience.

The team has gotten rave reviews for The Void experience and is working on expanding it to other locations.

[Photo: The Void co-founders James Jensen, Curtis Hickman, and Ken Bretschneider]

The Fight for Film

Lately we’ve been talking about the transition from film to digital, and I recently came across an article that said The Lost City of Z (a new movie about a hidden city in the Amazon) was shot on 35mm and that the footage had to be flown out of Colombia.


Having watched the trailer, I do see how a lot of the colour of the Amazon comes through on the film stock. But James Gray, the director, felt the need to shoot on film in order to achieve a certain look, and that came at a cost to the production. The movie was shot on location in both Colombia and Peru, and Gray stated in an interview, “To be candid about it, a certain madness kind of sets in. There’s no way to avoid it.” With scorching temperatures and a lack of hot water, shooting on film was an added challenge. Lead actor Charlie Hunnam had to take a day off from filming because a bug had crawled into his ear and he couldn’t remove it. Talk about tough. Shooting on film also added $750,000 to the production budget.


James Gray said that shooting on film changed the process for him, too: because he had no way of seeing the footage immediately after it was shot, in a way he was experiencing the story the way Percy Fawcett, the real-life adventurer the movie is based on, did, and it became an immersive experience. But he did get shots like this:

[Stills from The Lost City of Z, featuring Charlie Hunnam and Tom Holland]

These shots feel (at least, to me) like they benefitted from film. The hazy look and the slightly grainy texture create an image that makes you feel the heat, the dampness of the jungle, the tension in the air. But it does raise the question: is it worth it? With the progression of digital, our ability to manipulate an image, whether in camera (with a lens, gel, or filter) or in post (through colour correction and editing), is reaching a point where we can (sort of) recreate the effects of film. It’ll be interesting to see where it goes.
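As a crude illustration of what “recreating the effects of film” in post can look like, here is a rough Python/NumPy sketch that adds a little haze, warmth, and grain to a digital frame. Real film-emulation workflows rely on scanned grain plates, LUTs, and much more careful colour science; the function and parameter values below are just illustrative assumptions, not any colorist’s actual recipe.

```python
import numpy as np

# Rough "film look" emulation: lift the blacks for a hazy feel, warm the
# color balance, and add grain. Parameters are illustrative guesses.

def film_look(frame, grain_strength=0.04, haze=0.06, warmth=1.05, seed=None):
    """frame: H x W x 3 float array in [0, 1]. Returns a 'filmic' version."""
    rng = np.random.default_rng(seed)
    out = frame.astype(np.float64)

    # 1. Haze: lift the shadows slightly, compressing contrast like flashed film.
    out = out * (1.0 - haze) + haze

    # 2. Warmth: nudge the red channel up and the blue channel down.
    out[..., 0] *= warmth
    out[..., 2] /= warmth

    # 3. Grain: per-pixel noise, slightly stronger in the midtones.
    midtone_weight = 1.0 - np.abs(out.mean(axis=-1, keepdims=True) - 0.5) * 2
    grain = rng.normal(0.0, grain_strength, out.shape) * (0.5 + 0.5 * midtone_weight)
    out += grain

    return np.clip(out, 0.0, 1.0)

# Toy usage on a synthetic frame; with real footage you'd run this per frame.
digital_frame = np.random.rand(1080, 1920, 3)
graded = film_look(digital_frame, seed=42)
```

The point isn’t that a few lines of code replace shooting on 35mm; it’s that many of the qualities we associate with film are now things we can approximate after the fact, which is exactly why the “is it worth it?” question keeps coming up.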

 

From “Tangerine” to the Style of Independent Film

At an interview, I said that when I did my undergrad, the style of films we appreciated was very different from what USC likes: USC is more industrial, with a Hollywood style, while at the time we liked the independent style better. Then the interviewer asked me, with a questioning look: can you describe the independent film style you mentioned? I was so nervous after hearing the question, thinking I had said something wrong.

I do think there is an independent film style. Sometimes the style comes without a choice, but sometimes it is a choice made by the director. Watching Tangerine last week confirmed this thought for me.

Independent films are, most of the time, compared to Hollywood films by the audience. The term normally refers to films that are not produced by the major Hollywood studios, and sometimes to art films and experimental films.

With the appearance of cheaper digital cameras, non-linear editing software, sound mixing software, and PCs, making a film has become easier and easier.

Using non-stars, telling stories of ordinary life or social issues, and showing the mundane details of daily life on a low budget, independent film has its own unique brand. Films this year like Manchester by the Sea are an example.


As a result of smaller budgets, independent films tend to use natural light, handheld camera or steady long takes, and either long stretches of dialogue or very limited dialogue.

Because of the smaller cameras and limited lighting, indie filmmakers are able to show what real life looks like, bringing the film closer to the audience. That is just what Tangerine did: it shows a real Los Angeles, very different from the glorified DTLA of other films, and the small iPhone 5s allowed for more immediate reactions from the filmmaker and a smaller film set.


I believe this difference in style was determined, in the first place, by the different use of film technologies.

VFX Not Enough to Redeem “Ghost in the Shell”


After months of closely following the controversy surrounding the whitewashing of the lead character, I was eager to see how Rupert Sanders’ adaptation of Ghost in the Shell would fare upon its release. I myself watched the film last week in Film Symposium, and while I was not terribly impressed by Scarlett Johansson’s performance or the thin narrative, I believed that John Dykstra’s gorgeous VFX would find a sizable audience for the film…

I was wrong. According to the linked article from Deadline, Ghost was a total flop at the box office and, according to the author’s calculations, will lose at least $60 million. The article cites poor marketing, the lack of a “hands-on executive,” the controversy surrounding Scarlett Johansson’s casting, and an “exorbitant cost” for such a niche project as reasons for the film’s failure.

All of this makes perfect sense to me, though the phrase that stood out to me the most was the author’s assertion, “You need a story to sell well beyond the visual shock and awe.” In a time when action-packed blockbusters devoid of strong narratives seem to thrive (see: Transformers, Suicide Squad, Fast and Furious, etc.), I would have expected exactly the opposite to be true; as much as I hate to admit it, it often seems as if the majority of domestic and international audiences prefer big explosions and other digitally enhanced visuals to nuanced, slow-burning character development.

Of course, part of the problem with this particular story is not just that it was thin but that it was based on a manga/anime franchise that is relatively obscure in the U.S. I was very intrigued by the author’s observation that it was irresponsible of Paramount to spend so much money on a “niche IP.” Of course, visual effects only comprised a fraction of the $100 million plus cost of the film, but I was fascinated by the idea that the familiarity of source material dictates, in part, the amount and quality of VFX in a film, especially in light of last week’s in-class discussion of the relationship between technology and economy. It is mind-blowing to think that an interesting concept alone is not enough to dictate the budget of a film and, by extension, the amount spent on special effects; not only was it a bit of a risk to invest so much in a niche project like this, but this article suggests that it was downright irresponsible.

Above all, I am shocked (yet pleased) that, this time, the all-too-familiar problem of a white actor being cast in a non-white role outweighed the special effects work of no less than John Dykstra of Star Wars, Star Trek, and Battlestar Galactica fame. This gives me hope as both a writer and a self-proclaimed activist for onscreen diversity. Despite our current political climate (ahem), or perhaps because of it, the backlash to Scarlett Johansson’s casting as a Japanese cyborg-woman helped sink what looked and felt like a lucrative blockbuster film. This aspect of the film’s failure, too, is a fascinating example of the relationship between film technology and broader social issues such as race, gender, and class. Whereas we’ve previously discussed in class how “racist” technology itself can be (e.g., 3D glasses), this is an example of how technology failed to mask the problematic depiction of race in a contemporary blockbuster.

Overall, Ghost in the Shell might be a box office failure, but it’s a fantastic case study in the many factors, film technology among them, that can impact a film’s success.