People search for intelligence in the cosmos. Can one find it on IMDb?

An impatient visitor can jump right now to the intelligently reordered “canonical lists of films” made by other people, or to the lists of the best films according to intelligently calculated grades, or to the lists of films I value a lot (at the end). You will be able to return to this page when needed (the lists have links back to this page).

How to choose which film to watch, and how not to choose?

I find the opinions of film critics mostly worthless for this purpose. (Maybe they have a corporate ethic which does not allow them to express themselves freely? Maybe they need to be heard over a chorus, and/or not laughed at, and this influences what they decide to say? Maybe their interests in films are just too remote from mine? Maybe they want to cater to the majority? It does not matter: the net result is that I see no use for what they say...)

Would you like to listen to what people on the street say? If that satisfies you, then you probably do not need to continue reading this page...

The opinions of trusted (and tested) friends work much better, but they usually come drop by tiny drop, so they can be used only sparingly. There is a major collection of film reviews and ratings on IMDb; wouldn't it be wonderful if this information were as useful as the opinions of your trusted-and-tested friends? Experience shows that the apparent answer is “no way”: the opinions of individual reviewers span the whole range, and it looks like, without knowing these people first, one cannot decide which of them deserve more attention than the others.

What about averaging the opinions of reviewers? At least that produces “a fair” grade: every film is judged by the same rules as any other film. Unfortunately, the same problem arises as in the justice system: where one can ensure fairness, there remains only very little expectation of “justness”. The people you would like to hear are drowned out by the opinions of the majority, whom you would like to ignore (but do not know how). A vivid example of this is the “IMDb Top 250” movies list: it is constructed by averaging the grades. Inspect the results: there is a fair number of good movies there, but there is also a large number of movies in the “better skip this” category.

This is the famous “designed by a committee” syndrome. Quite often the result of balancing the opinions of several well-meaning wise people is much worse than what any one of these people would do alone.

The good news:

I've read information about thousands of movies I saw, and slowly I realized that there is a certain system in this madness. The result? Now I can find, in advance, a certain “preliminary estimate” of whether a film deserves to be seen. This estimate works no worse than (and maybe even better than) the opinions of trusted-and-tested friends. (After viewing a film, I do not always agree with what the friends said when recommending it; the same holds for this “preliminary estimate”, but it feels that I agree with the estimate more often.)

What to do?

There are two sides to the story. First, given a film “well-represented on IMDb”, one should be able to find a “slice” of IMDb reviews which has a fairly good chance of containing an intelligent discussion of the film. (Quite often, this discussion is a little bit on the rough side; but when one gets used to inspecting this slice, one should be able to compensate for this over-critical fervor.)

Second, one can extract a numeric grade for a film. Again, it is a “fair” grade, so it is not always “just”; but according to my observations, films with a high grade are very often at least highly watchable. (It is the low grades which are more often wrongly low; but again, not very often.) This “intelligent grade” turns out to have very little correlation with the average IMDb grade...

Given that, one can solve a mass problem: take a large collection of films and order them using that grade. If one starts with a large enough collection, one gets a lot of films with a high “intelligent grade.” (A minimal sketch of this two-step idea is given below.)
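To make the two-step idea concrete, here is a minimal sketch in Python. The actual selection rule used for the lists on this page is not published here, so `looks_intelligent` (and the review/film data layout) are purely hypothetical placeholders, not the real algorithm:

```python
from statistics import mean

def looks_intelligent(review):
    # Hypothetical stand-in for the real selection heuristic;
    # e.g. favor substantial reviews with non-extreme grades.
    return len(review["text"]) > 500 and 2 <= review["grade"] <= 9

def intelligent_grade(reviews):
    """Average the grades of the selected "slice" of reviewers."""
    slice_grades = [r["grade"] for r in reviews if looks_intelligent(r)]
    if not slice_grades:
        return None  # pool too shallow: the algorithm "gives up"
    return mean(slice_grades)

def order_collection(films):
    """Order a large collection of films by the intelligent grade."""
    graded = [(intelligent_grade(f["reviews"]), f["title"]) for f in films]
    return sorted(((g, t) for g, t in graded if g is not None), reverse=True)
```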

Combining the two sides:

The second approach gives lists of films with a high “intelligent grade;” next, one can inspect the “intelligent reviews” given by the first approach. For me, this works like a charm: I found massive amounts of films with inviting reviews; watching most of them was a significant event for me. (Many of these films are “surprises”: films which I would never have been able to choose based on a priori assumptions about which films “sound good” or “sound bad”.)

What I wanted to do next was to enable other people to enjoy this two-strikes approach. So I took some well-known “lists of best films” and (re)ordered them using the “intelligent grades.” The films in these lists are equipped with links to “intelligent reviews” (and with tons of other info about the films). So this combines the opinions of the critics (or laymen) who created the original lists with the opinion of the “intelligent slice” of IMDb contributors.

Additionally, I took a list of “all more or less notable films” and selected the best films according to the “intelligent grade”. This gives the second half of the lists below. These should be the absolute best films according to the “intelligent slice” of IMDb contributors.

The dark side of the moon.

Above, I focused on the bright sides of my approach. Of course, it has its dark sides as well; some of them may be worked around, and some not...

First of all, the algorithm which chooses “the intelligent slice” of IMDb contributors is based on what I observed about people on IMDb. So, although it is “a fair algorithm”, it carries traces of my opinions on what is intelligent and what is not. Your opinion will undoubtedly differ from mine, so this “slice” may not work as well for you as it works for me. In short: the “intelligent grade” may not be good for you. (There is little I can do about this right now.)

Second: to get a trustworthy numeric grade, one must start with a large enough pool of opinions about a film. It turns out that good films do not necessarily achieve wide popularity on IMDb; so the pool for a good film may be quite shallow, and the resulting numeric grade may be subject to significant statistical flukes. A greedy viewer (like me) would not want to miss these good films; so it is better not to omit films with shallow pools.

The problem comes from two factors: the fraction of films which deserve a high “intelligent grade” is very low, and the majority of films have shallow pools of opinions. So it turns out that the legitimate high-grade films with deep pools of opinions get “contaminated”: they are crowded out by a large number of shallow-pool films whose high grade is a fluke.

This happens if one mixes films with deep pools together with films with shallow pools. The solution: we use a new measure, “robustness”, which reflects the size of the pool of opinions, and we provide ordered lists separately for higher-robustness and lower-robustness films. This avoids the “contamination” issue; one should only be aware that the grades in the lower-robustness lists are not as trustworthy as those in the higher-robustness ones.
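As an illustration, here is how such a bucketing might look. The page does not define how robustness is actually computed, only that it reflects the size of the pool of opinions, so the log-of-pool-size formula below is an assumption; the band boundaries match the lists further down:

```python
import math

def robustness(n_slice_reviews):
    # Assumed formula: the text only says robustness reflects the size
    # of the pool of opinions; a logarithm is one plausible choice.
    return math.log10(max(n_slice_reviews, 1))

def robustness_band(r):
    """Map a robustness value to the bands used in the lists below."""
    if r >= 3.5:
        return "3.5 or above"
    if r >= 2.0:
        return "2.0..3.5"
    if r >= 1.5:
        return "1.5..2.0"
    if r >= 1.2:
        return "1.2..1.5"
    return "below 1.2 (not listed)"
```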

(About 340,000 films were inspected; we could assign a grade to about 150,000 of them. About 57% of these have low robustness [below 0.5]; about 38% of the rest have robustness below 1.0. Of the remaining 40,000, 88% have robustness 1.2 and above [as in the lists below], and 24,000 have robustness 2.0 or above, which is reasonably high.)
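The quoted percentages are mutually consistent; a quick back-of-the-envelope check:

```python
graded = 150_000
low = round(graded * 0.57)          # robustness < 0.5  -> ~85,500 films
mid = round((graded - low) * 0.38)  # robustness < 1.0  -> ~24,500 films
remaining = graded - low - mid      # -> ~40,000 films, as stated above
print(low, mid, remaining)          # 85500 24510 39990
```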

Third, there is a small but noticeable fraction of “false negative” calculated grades given to “obviously worthy” films (for example, inspect the end of the re-ordered Top250 list). I suspect that the algorithm I use cannot distinguish a “worthy but controversial” film from a non-worthy one. Again, I do not know a remedy for this dark side. (To summarize: if the grade is high [above 7.5], I find that an overwhelming majority of such films are “not a waste of time”. If the grade is low, and you know OTHER STRONG reasons to expect that the film is good, do not let the low grade affect your expectations too much.)

Anyway, if a film requires a certain amount of work to obtain, AND I have no special reason to want it, AND the “intelligent” grade is below 7.0, I usually do not bother to see it. There are THOUSANDS of films with (robust) grades above 7.0, and I expect that an overwhelming majority of the films with a robust grade above 8.0 (about 2,400 of them!) are going to be a significant experience. So why bother with the cinderellas?
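The rule of thumb from the last two paragraphs, restated as a hypothetical predicate (the thresholds 7.0 and 7.5 come from the text; the function name and arguments are mine):

```python
def worth_the_trouble(grade, robust, strong_outside_reasons=False):
    """Sketch of the viewing heuristic described above."""
    if strong_outside_reasons:
        return True  # a low grade should not override strong expectations
    return robust and grade is not None and grade >= 7.0
```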

Now what?

What are the possible future developments? If one inspects the “intelligent” grades soon after a film is released, and again at later times, one may notice a certain interesting dynamic of the grade. For many films, the grade is significantly higher when they are young, and it goes down to much more appropriate values later. Currently, this makes the grades of “young” films less reliable. One may want to investigate the “natural laws” of this dynamic and try to correct for it. (Such a correction would make the intelligent grade much more useful for “young” films.)
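Nothing here specifies the “natural law” of that drift, but once fitted empirically, a correction could be applied as simply as this (the drift curve below is invented purely for illustration):

```python
def corrected_grade(raw_grade, years_since_release, drift):
    """Subtract the estimated early-hype drift from a young film's grade."""
    return raw_grade - drift(years_since_release)

# Invented example: half a grade point at release, halving every year.
example_drift = lambda age: 0.5 * (0.5 ** age)

print(corrected_grade(8.2, 0, example_drift))  # 7.7
print(corrected_grade(8.2, 3, example_drift))  # ~8.14
```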

As one accumulates the experience of observing the “intelligent” reviews chosen by the algorithm, one finds certain similarities in the style/spirit of these reviews. As one learns to recognize these details of style, one becomes able to pinpoint such “reasonable” reviews even in cases when the algorithm gives up (when the pool of reviews is so shallow that the algorithm cannot select “intelligent” reviews by itself). This way, by perusing the results of the algorithm, I found that my own ability to navigate IMDb reviews improved a lot (though it took several years).

(Re)ordered lists

First of all, you can check whether you like the “intelligent” grade better than the “normal”, averaged IMDb grade. Below are several notable movie lists; choose one of the lists, and compare the two ways to order it, “Intelligent” and IMDB’s:

List source | Sorted by grade | Notes
IMDB’s Top250 | “Intelligent” · IMDB’s | as of July 2014
IMDB’s Top1000 | “Intelligent” · IMDB’s | as of July 2014
1001… must see before…die | “Intelligent” · IMDB’s | Edition of October 2013; see also the electronic version (another version)
NYT’s 1000 movies | “Intelligent” · IMDB’s | See also this version on RinkWorks
Jonathan Rosenbaum’s 1000 | “Intelligent” · IMDB’s | See also this version on IMDB
David Thomson’s 1000 | “Intelligent” · IMDB’s | See also this version on IMDB
planktonrules’s 10/10s | “Intelligent” · IMDB’s | See also his homepage and some analysis. These films are not counted in the totals.
Criterion Collection | “Intelligent” · IMDB’s | See also this version on RinkWorks
Christmas movies | “Intelligent” · IMDB’s | The list was created at the end of 2013

(The “1000-files” are about 2.5 MB each; they may take a few seconds to load in the browser.) By the way, there are about 3,930 films with GR-grades in these lists…

(One way to inspect things is to look for the films you know. If these films appear high in the sorted list, check whether you consider them “worthy”. See which of the two ways of sorting better matches your notion of “worthiness”. However, this comparison should better be restricted to the top half of the lists. Why? Unfortunately, since the films in these lists are already “notable”, and the algorithm we use gives a noticeable fraction of false negatives, the films at the bottom of the “intelligently sorted” lists contain a significant fraction of false negatives.)

What if you do not see the improvement? Then you can still try to inspect the links to “intelligent reviews”; do you find them useful? Even if you do not find these reviews appropriate, you can still use the additional information the lists are instrumented with. Anyway, I would like to hear your feedback on these issues.

Lists of films ordered by the “intelligent” grade

Taken together, the lists below contain about 5,510 films (with the reordered lists above, this makes about 8,710 films with GR-grades). For an explanation of the terms, read this page from the beginning (or just start from the upper-left entry of the table, then inspect its neighbors, etc.).

“Intelligent” grade | Robustness
8.7 or above | 3.5 or above · 2.0..3.5 · 1.5..2.0 · 1.2..1.5
8.1 .. 8.6 | 2.0..3.5 · 1.5..2.0 · 1.2..1.5
7.7 .. 8.0 | 3.5 or above · 2.0..3.5 · 1.5..2.0 · 1.2..1.5
7.4 .. 7.6 | 3.5 or above · 2.0..3.5 · 1.5..2.0 · 1.2..1.5
7.2 .. 7.3 | 3.5 or above · 2.0..3.5 · 1.5..2.0 · 1.2..1.5

Similar lists with only Feature, non-Adult, non-Animation films:

“Intelligent” grade | Robustness
8.7 or above | 3.5 or above · 2.0..3.5 · 1.5..2.0 · 1.2..1.5
8.1 .. 8.6 | 2.0..3.5 · 1.5..2.0 · 1.2..1.5
7.7 .. 8.0 | 3.5 or above · 2.0..3.5 · 1.5..2.0 · 1.2..1.5
7.4 .. 7.6 | 3.5 or above · 2.0..3.5 · 1.5..2.0 · 1.2..1.5
7.2 .. 7.3 | 3.5 or above · 2.0..3.5 · 1.5..2.0 · 1.2..1.5

See also the list of films that I value:

ordered by the “intelligent” grade, or by the average IMDb grade. (There are about 350 films with GR-grades; this brings the total up to about 8,850, or about 9,125 including films without GR.)

Older version(s) of this page. (The lists were done in July 2014; links renewed in December 2014.)