How Netflix calculates you

Nothing is left to chance on the Internet. Netflix suggests series the user might like; Facebook suggests pages, Amazon products, Google places, and Tinder partners. While you can drift through a bookstore and browse sections that are of no real interest to you, and while you stumble upon unexpectedly appealing things while zapping through TV channels or leafing through newspapers and magazines, the algorithm confronts you with ever more specific recommendations. The user, a calculated and predictable being, is entirely transparent in his behavior.

Netflix's rating and recommendation system shows the viewer a percentage indicating how closely a series matches the viewing habits and personal preferences the streaming service has determined. Beneath a series it reads, in green letters: "98 percent match." More than 80 percent of the series watched on Netflix are "discovered" through this recommendation system. Clicks, likes, shares. And media usage behavior can be broken down in ever finer detail.
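Netflix does not publish how the match percentage is computed, but one common way to score agreement between a user profile and a title is cosine similarity between two feature vectors. The sketch below is purely illustrative under that assumption; the genre vectors and the `match_percentage` function are hypothetical, not Netflix's method.

```python
import math

def match_percentage(user_prefs, title_features):
    """Cosine similarity between a user's genre-preference vector and a
    title's genre profile, scaled to a 0-100 'match' score."""
    dot = sum(u * t for u, t in zip(user_prefs, title_features))
    norm_u = math.sqrt(sum(u * u for u in user_prefs))
    norm_t = math.sqrt(sum(t * t for t in title_features))
    if norm_u == 0 or norm_t == 0:
        return 0
    return round(100 * dot / (norm_u * norm_t))

# Hypothetical vectors over the genres [drama, thriller, comedy]
print(match_percentage([0.9, 0.8, 0.1], [1.0, 0.7, 0.0]))  # → 99
```

A score near 100 therefore only says that the title resembles what the viewer has already watched, which is exactly the self-confirming tendency the article goes on to criticize.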

Of course you could say: it's great when algorithmic recommendation systems suggest a film. Then you don't have to comb through the media library for something suitable yourself. The deal is: entertainment fodder for data fodder. And in the end, everyone is full?

It is not that easy. The algorithm may well bring to light a hidden interest that the viewer would never have thought of on his own. But the calculations and forecasts of what will please eliminate a variable that is elemental to the culture and science industries: chance.

According to a report in the tech magazine Wired, Netflix runs about 250 A/B tests on its users every year to find out how they react to changes in the program. In such an A/B test, as it is often conducted in online marketing, users see two different versions of a website. The flow of visitors is automatically divided: 100,000 users are randomly selected for the test group, and a further 100,000 for the control group. On average, Netflix's data scientists found, a user browses 40 to 50 titles before watching a series. When one version gets more people to watch, that design is implemented in the streaming service. Even small changes to a title image can increase views by 20 or 30 percent.
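The procedure described here can be sketched in a few lines: randomly split users into a test and a control group of equal size, then compare how often each group starts watching. The function names and sample figures below are illustrative assumptions, not Netflix code.

```python
import random

def assign_ab_groups(user_ids, group_size, seed=42):
    """Randomly assign users to a test group and a control group,
    mirroring the split described in the article."""
    rng = random.Random(seed)  # seeded for reproducibility
    sampled = rng.sample(user_ids, 2 * group_size)
    return sampled[:group_size], sampled[group_size:]

def lift(test_watch_rate, control_watch_rate):
    """Relative increase in views of the test version over the control."""
    return (test_watch_rate - control_watch_rate) / control_watch_rate

# Toy example: 10 users, groups of 3 (Netflix reportedly uses 100,000 each)
test_group, control_group = assign_ab_groups(list(range(10)), group_size=3)
print(lift(0.36, 0.30))  # roughly 0.2, i.e. a 20 percent lift
```

The design decision that matters is the random assignment: because membership in either group is independent of user behavior, any difference in watch rates can be attributed to the changed artwork rather than to the audience.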


On the Netflix Technology Blog on Medium it says: "By taking an empirical approach, we ensure that product changes are not influenced by the most opinionated and vocal Netflix employees, but rather driven by data, which allows our members to lead us to experiences they love." In effect, Netflix subscribers are paying subjects in constant market research. Dozens of variables are measured for each film: viewing duration, last access, titles watched. The "Netflix Quantum Theory" breaks every film down into its individual mathematical components. Todd Yellin, product manager and eccentric mastermind at Netflix, once said his goal was to "tear down the content." That reveals the thinking behind it: it is no longer about content but about formulas. From viewing behavior, Netflix can deduce which series will be an economic success and which are more likely to flop. House of Cards was launched in 2013 on a decidedly algorithmic calculation: there was statistical evidence that the success of the earlier BBC series could be repeated with Kevin Spacey.

In his book "What Algorithms Want: Imagination in the Age of Computing", the cultural and media scholar Ed Finn advances the thesis that algorithms are culture machines in the sense that they generate cultural objects, processes and experiences: "Netflix presents a seamless, computerized facade, because we have reached a point where we trust the suggestions of a strange machine more than those of a stranger." Netflix, he argues, has to be read as an assemblage of algorithms, interfaces and discourses.

An interesting post-structuralist thought: algorithms, which as the work of programmers are themselves cultural artifacts, keep producing the same expected results, which the audience then talks about. And that feedback in turn provides the impetus for new productions. A self-referential system. House of Cards, Finn writes, stands for one of the "most seductive myths of the algorithmic age": "the ideal of personalization, of tailor-made content assembled for each of us."

Certainly, Netflix has established a new television culture. The question is what it means for cultural production when deterministic algorithms decide what goes down well with the audience. What does it mean for film aesthetics? Will only series that promise high "engagement" be produced in the future? How much culture is left in a data-driven culture industry, where watching a film is not just watching a film but also a permanent marketing exercise? How mathematizable and calculable can culture be, when its value also feeds on the incalculable, the unpredictable and the uncomfortable? Doesn't this seriality ultimately leave the viewer in a state of ennui, because nothing is surprising anymore?

There are plenty of stories of cultural and media workers whose creative ideas sprang from chance encounters: an article you stumble across while reading a newspaper in a café, a personality you happen to meet in a public place. The French writer Honoré de Balzac once called coincidence "the greatest novelist in the world". We owe many scientific inventions to chance: Teflon, penicillin, photography. All unplanned.

Serendipity is the English term for the chance observation of something one was not looking for. But not-searching does not exist in digital culture, apart from a few randomization techniques such as shuffle in a music playlist. Every search must lead to a predefined goal. With automated systems that "think" developments from their end point, chance, and with it the creative element, is lost. In the digital world there is no more strolling, no accidental stopping, no getting stuck on something. What we read, see, hear, eat and feel is decided and manipulated more and more often by algorithms.

The US information scholar Christian Sandvig speaks of "corrupt personalization": with algorithms that pretend to serve our interests, we have created a system that primarily serves commercial interests in conflict with our own. The algorithmic recommendation systems ("You might also be interested in this") corrupt users. What the computer computes as a result becomes the starting point for further searches, creating a feedback loop. The viewer becomes a data package, supplied with series, films and news that are meant to be pleasing, expected, affirmative. Like a modern court theater.
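The feedback loop described here can be made concrete with a toy simulation: if watching a recommended title raises that title's ranking score, the same titles keep resurfacing and diversity collapses. The scoring scheme below is a deliberately simplified assumption, not any platform's actual ranking logic.

```python
def feedback_loop(scores, rounds=5):
    """Each round, recommend the top-scored title; 'watching' it raises
    its score, so the recommendation reinforces itself."""
    history = []
    for _ in range(rounds):
        top = max(scores, key=scores.get)  # pick the current front-runner
        history.append(top)
        scores[top] += 1  # engagement feeds straight back into the ranking
    return history

# Hypothetical starting scores: drama already leads by one point
print(feedback_loop({"drama": 3, "documentary": 2, "comedy": 1}))
# the initially highest-ranked title wins every single round
```

Even a one-point head start is never overcome, because the loop contains no source of randomness or novelty, which is precisely the absent serendipity the article laments.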