
STAR WARS: Episode IX - December 20, 2019

Just now, valiantman said:

Not at all. I don't question that businesses will always do what they think will earn them more (short-term) profits. (I do not believe most businesses think much about long-term profits, but that's a tangent to this discussion.)

BUT... to make the case that the math ALSO proves the manipulation, there needs to be a math proof.

In the case of 86% after 8,000 still being 86% all the way to 97,000, there is no math proof.  

There are likely dozens of articles asking, "how can this be true?" but the math itself says, "very easily, with 99.999%+ confidence."

You work for a third-party data aggregation service provider, right? See if they can get Rotten Tomatoes to turn over the raw results as part of its film-industry dataset offerings.

Meanwhile, all it would take is a ±1% shift in the overall vote population today to move the results. That small a deviation. That's the simple statistical reality (this morning's updates).
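For illustration only (the raw Rotten Tomatoes vote data isn't public, so the counts below are assumptions), here is a minimal Python sketch of both points: the margin of error on an 86% score after 8,000 votes, and how flipping about 1% of the votes moves the headline percentage.

```python
import math

# Hypothetical figures; the real raw Rotten Tomatoes vote data is not public.
n = 8_000          # audience votes counted so far (assumption)
p = 0.86           # reported "fresh" share (86%)

# Normal-approximation 95% margin of error for a binomial proportion.
moe = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"86% of {n:,} votes -> 95% margin of error ~ +/-{moe:.2%}")
# ~ +/-0.76%, so the score holding near 86% out to 97,000 votes is unremarkable
# if later voters resemble earlier ones.

# A ~1% shift in the vote mix: flip 1% of all votes from "rotten" to "fresh".
positive = int(n * p)
flipped = int(n * 0.01)
new_share = (positive + flipped) / n
print(f"Flipping 1% of {n:,} votes moves the score from {p:.0%} to {new_share:.1%}")
```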

[Image: ROS_Example02.PNG]

:smile:


5 minutes ago, valiantman said:

Now, if you want a really good conspiracy... :devil:

 

Big studios recognized they're getting caught manipulating results, so they have hired nerds like me (maybe even me!) to tell them how much manipulation would still be statistically defensible.

:foryou:

You think I don't realize this? Remember, I dealt with Acxiom, LexisNexis and Mercury Analytics. How else would I know which statistical analytics terms to look up on Google? :shy:

And with Acxiom (I believe you work/worked for them), there's a strong partnership there (Acxiom-Disney Data & Analytics Conference 2019). :smile:

:angel:


I haven't read all of it, but I did get a "C" in my college statistics course; I passed by the skin of my teeth.....

SO I'm educated enough to know that they are....... indeed discussing statistics :insane: 

that is all.


1 hour ago, ADAMANTIUM said:

I haven't read all of it, but I did get a "C" in my college statistics course; I passed by the skin of my teeth.....

SO I'm educated enough to know that they are....... indeed discussing statistics :insane: 

that is all.

Don't let the 'my stats mojo is better than your mojo' chatter throw any of this off. You're still my hero. :foryou:

Meanwhile, reflecting on just the 6 days of results, here's where we are.

[Image: ROS_ratings04.PNG]

The reality: as the 40-year franchise capstone event, ROS is trailing the other modern-run films' domestic results on a same-day basis (Day 45, a Sunday). That's not good. And that's coming from the market that made Avengers: Endgame the dominant force as the biggest box office film in history. So the potential for a bigger result was massive when you consider this other 11-year franchise capstone event from Disney.

[Image: DC_MCU_BO200204b.PNG]

And based on all the indicators audiences reference to decide whether a film is worth seeing or not (CinemaScore, Metacritic, IMDb, Rotten Tomatoes), it is not a stretch to imagine they are being influenced by this.

[Image: DC_MCU_BO200204c.PNG]

No advanced degrees, Google searches, or tinfoil-hat propaganda necessary when reality is there in your face.


3 hours ago, RedRaven said:

There are three kinds of lies:

  1. Lies
  2. Damned lies
  3. Statistics used incorrectly

Fixed that for you.  (No, that's not a dig against Bosco, that's a correction to the famous "lies, damned lies, and statistics" quote.)


5 hours ago, Bosco685 said:

You work for a third-party data aggregation service provider, right? See if they can get Rotten Tomatoes to turn over the raw results as part of its film-industry dataset offerings.

Meanwhile, all it would take is a ±1% shift in the overall vote population today to move the results. That small a deviation. That's the simple statistical reality (this morning's updates).

[Image: ROS_Example02.PNG]

:smile:

What we really need are the first 250 votes from the audience on IMDb and Rotten Tomatoes, then the running counts after every 20 additional votes, all the way up to the first 510 or so audience votes.  That would show us some fluctuations.

The other thing we'd need is all 8,000 critic votes, but there aren't 8,000 critics, so we're not going to get that.

If we had the first 250 to 510 audience votes, we'd have something to compare against the first 250 to 510 critic votes.  As it is, we're comparing 8,000 to 500, and it doesn't matter how you chart it, that's not really valid either (but that's the industry standard, so what-are-we-gonna-do).
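Neither site exposes vote-level data, so as a rough sketch of the comparison being described, here is a small Python simulation with made-up votes standing in for the real ones: a running score after the first 250 votes and then at every 20 additional votes up to roughly 510, for an assumed ~52% "critic-like" pool and an assumed ~86% "audience-like" pool.

```python
import random

random.seed(42)

def running_scores(p_fresh, checkpoints):
    """Simulate individual fresh/rotten votes and return the running
    percent-fresh at each checkpoint. A stand-in for the unavailable raw data."""
    votes = []
    out = {}
    for i in range(1, max(checkpoints) + 1):
        votes.append(1 if random.random() < p_fresh else 0)
        if i in checkpoints:
            out[i] = sum(votes) / i
    return out

# Checkpoints: first 250 votes, then every 20 votes up to ~510.
checkpoints = set(range(250, 511, 20))

critics  = running_scores(p_fresh=0.52, checkpoints=checkpoints)  # assumed critic score
audience = running_scores(p_fresh=0.86, checkpoints=checkpoints)  # assumed audience score

for n in sorted(checkpoints):
    print(f"after {n:3d} votes: critics {critics[n]:.1%} | audience {audience[n]:.1%}")
```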


26 minutes ago, valiantman said:

What we really need are the first 250 votes from the audience on IMDb and Rotten Tomatoes, then the running counts after every 20 additional votes, all the way up to the first 510 or so audience votes.  That would show us some fluctuations.

The other thing we'd need is all 8,000 critic votes, but there aren't 8,000 critics, so we're not going to get that.

If we had the first 250 to 510 audience votes, we'd have something to compare against the first 250 to 510 critic votes.  As it is, we're comparing 8,000 to 500, and it doesn't matter how you chart it, that's not really valid either (but that's the industry standard, so what-are-we-gonna-do).

I think no matter what points you make, valid or noise, you are purposely ignoring all the other valid indicators which clearly confirm this film is not as strong as it should be. Arguing we need to perform sample analysis to validate the overall population is once again treating these collection methods as a sample survey of the overall potential movie-goer population rather than a customer satisfaction survey. Different focus.

Plus, with that comment about just needing the first 250 votes to perform a sample analysis...

[Image: os_audience_change.png]

By Day 6, the vote count was up 437.8% from Days 3-5. Those early results are now so diluted by the drastic trending that a sample of those first 250 votes would be far less representative of what has occurred since. And again, this is not a sample survey of the overall potential movie-goer population.
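To put the dilution point in rough numbers (hypothetical counts, since the actual per-day vote totals aren't published), a quick Python sketch: once the later votes outweigh the early ones several times over, the blended score is dominated by the later pool.

```python
# Hypothetical counts for illustration only.
early_n, early_score = 250, 0.90              # assumed score among the first 250 votes
later_n, later_score = int(250 * 4.378), 0.86  # later pool ~4.4x larger, assumed score

blended = (early_n * early_score + later_n * later_score) / (early_n + later_n)
print(f"Blended score: {blended:.1%}")
print(f"Early votes are only {early_n / (early_n + later_n):.1%} of the total")
```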

You are forcing yourself to stay committed to your original assumption that no tampering occurred with the audience score. That's your reality. Dismiss all the indicators all you like. You're off-base.


4 hours ago, Bosco685 said:

Don't let the 'my stats mojo is better than your mojo' chatter throw any of this off. You're still my hero. :foryou:

Meanwhile, reflecting on just the 6 days of results, here's where we are.

[Image: ROS_ratings04.PNG]

The reality: as the 40-year franchise capstone event, ROS is trailing the other modern-run films' domestic results on a same-day basis (Day 45, a Sunday). That's not good. And that's coming from the market that made Avengers: Endgame the dominant force as the biggest box office film in history. So the potential for a bigger result was massive when you consider this other 11-year franchise capstone event from Disney.

[Image: DC_MCU_BO200204b.PNG]

And based on all the indicators audiences reference to decide whether a film is worth seeing or not (CinemaScore, Metacritic, IMDb, Rotten Tomatoes), it is not a stretch to imagine they are being influenced by this.

[Image: DC_MCU_BO200204c.PNG]

No advanced degrees, Google searches, or tinfoil-hat propaganda necessary when reality is there in your face.

The reality of ROS can't be escaped. The film is not as strong as it should have been as a 40-year franchise capstone film. Disney Marketing is doing what it can to address audience detractors, including potentially tampering with the audience score.

:smile:


22 minutes ago, Bosco685 said:

You are forcing yourself to stay committed to your original assumption that no tampering occurred with the audience score. That's your reality. Dismiss all the indicators all you like. You're off-base.

No, I'm suggesting that a true comparison would have identical counts for critics and audience totals, which cannot happen, will never happen, etc.

So, because all we have for critics is the first 250 through 500 critic responses, and we see fluctuation between 250 and 500, we could check the amount of fluctuation in the audience votes from 250 through 500 if we had them.

Comparing the fluctuations in the first 250 to 500 critic votes to the fluctuations in the first 250 to 500 audience votes would be something. 

 


45 minutes ago, valiantman said:

No, I'm suggesting that a true comparison would have identical counts for critics and audience totals, which cannot happen, will never happen, etc.

So, because all we have for critics is the first 250 through 500 critic responses, and we see fluctuation between 250 and 500, we could check the amount of fluctuation in the audience votes from 250 through 500 if we had them.

Comparing the fluctuations in the first 250 to 500 critic votes to the fluctuations in the first 250 to 500 audience votes would be something. 

A 'true comparison' on a customer satisfaction survey would need to factor in a few things for a long-running franchise with a fanatical fanbase.

1) For those providing early votes (first week), do we assume they fall into the 'fanatical fanbase' category and therefore their contributions will be skewed? CinemaScore surveyed those very same attendees, and on average they ranked it the lowest of the modern franchise.

[Image: ROS_Example03.PNG]

2) Do we have some form of screening to determine which are true movie-goers and not troll farm accounts that tried to impact early film results?

3) At what point did 'fanatical fanbase' votes transition over to more frequent 'general audience' votes? Do we have that as a criterion to confirm word-of-mouth was not impacted by that B+ CinemaScore?

See, the data is on my side when it comes to gamesmanship taking place, because there are too many indicators that this film required Disney Marketing manipulation. And the box office results are playing that out for what should have been a massive wrap-up. 


28 minutes ago, Bosco685 said:

A 'true comparison' on a customer satisfaction survey would need to factor in a few things for a long-running franchise with a fanatical fanbase.

1) For those providing early votes (first week), do we assume they fall into the 'fanatical fanbase' category and therefore their contributions will be skewed? CinemaScore surveyed those very same attendees, and on average they ranked it the lowest of the modern franchise.

[Image: ROS_Example03.PNG]

2) Do we have some form of screening to determine which are true movie-goers and not troll farm accounts that tried to impact early film results?

3) At what point did 'fanatical fanbase' votes transition over to more frequent 'general audience' votes? Do we have that as a criterion to confirm word-of-mouth was not impacted by that B+ CinemaScore?

See, the data is on my side when it comes to gamesmanship taking place, because there are too many indicators that this film required Disney Marketing manipulation. And the box office results are playing that out for what should have been a massive wrap-up. 

1) Even the "fanatical fanbase" early votes would include the "I-was-dragged-there-by-my-fanatical-friend" general audience.  I attended the first showing of the first evening the movie was released and at least half the audience was not drooling fanboys. lol

2) Nope, but I would be interested in putting real numbers to the vague concepts of "true movie-goers" and "troll farm accounts".

3) Since point 1) always included "general audience" votes, the transition away from the most fanatical of fanatical voters wouldn't necessarily change the audience score, since the most fanatical of fanatical voters seem just as likely to be "this was the worst movie ever" voters, judging by the first 2,000 responses in this topic.

The "perfect storm" for high audience scores is either A) a very low audience count with only votes from die hard supporters -or- B) a very high audience count where the majority of the audience expected nothing more than a popcorn flick (and received one).

If 100,000 voters produce a "B+" CinemaScore but it is determined that 15,000 of them rated it 10-out-of-10, is that better or worse than 10,000 voters giving an "A+" CinemaScore to a small movie no one saw?  It's debatable.  Percentage-wise, the small movie wins.  But in sheer volume, the B+ movie wins, with 15,000 voters giving it 10-out-of-10.
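Put in rough numbers (the small film's 10-out-of-10 share below is invented purely for contrast), the percentage-versus-volume trade-off looks like this:

```python
# Big film: 100,000 voters, 15,000 of them rating it 10-out-of-10 (per the example above).
big_total, big_tens = 100_000, 15_000
# Small film: 10,000 voters; its 10-out-of-10 share is an assumption for illustration.
small_total, small_tens = 10_000, 6_000

print(f"Big film:   {big_tens / big_total:.0%} tens, {big_tens:,} people")
print(f"Small film: {small_tens / small_total:.0%} tens, {small_tens:,} people")
# Percentage-wise the small film wins; by raw volume the big film wins.
```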

Now you're out of statistics and into philosophy.  Should someone pay to see a big-budget movie that a lot of people love (though many hate it), or pay to see a low-budget movie that almost no one saw but whose small group of "devotees" loved it?

(shrug)

If they were different prices, such as $15 a ticket for big-budget movies and $5 a ticket for small-budget movies, then we'd have a different statistical scenario because we could talk "return on investment" or risk analysis.

But, since theaters charge the same ticket price (as far as I know) regardless of the budget of the movie, we get into the philosophical question of whether an audience member can get $12 in value out of a movie they didn't really like (just from the special effects or the spectacle of the crowd), versus whether they can get $12 in value from a small-budget movie that people who are into that sort of thing love, playing to a nearly empty theater, when who knows if the general public is into that sort of thing?

(shrug)(shrug)

At no point would I say we have definitive answers to any of these questions, so I'm against the general idea of putting a lot of effort into "proving" the unprovable with declarations of certainty about not only what the manipulation is, but also who is doing it, where they're doing it, and how they're doing it.  If we don't know whether unicorns exist in the first place, we certainly don't know whether blue-maned, eucalyptus-eating, Australian ranch-based unicorns are being manipulated by Disney Studios for fun and profit.  We might be able to say "probably", but we can't say "definitely". 

:popcorn:


4 minutes ago, valiantman said:

1) Even the "fanatical fanbase" early votes would include the "I-was-dragged-there-by-my-fanatical-friend" general audience.  I attended the first showing of the first evening the movie was released and at least half the audience was not drooling fanboys. lol

2) Nope, but I would be interested in putting real numbers to the vague concepts of "true movie-goers" and "troll farm accounts".

3) Since point 1) always included "general audience" votes, the transition away from the most fanatical of fanatical voters wouldn't necessarily change the audience score, since the most fanatical of fanatical voters seem just as likely to be "this was the worst movie ever" voters, judging by the first 2,000 responses in this topic.

The "perfect storm" for high audience scores is either A) a very low audience count with only votes from die hard supporters -or- B) a very high audience count where the majority of the audience expected nothing more than a popcorn flick (and received one).

If 100,000 voters produce a "B+" CinemaScore but it is determined that 15,000 of them rated it 10-out-of-10, is that better or worse than 10,000 voters giving an "A+" CinemaScore to a small movie no one saw?  It's debatable.  Percentage-wise, the small movie wins.  But in sheer volume, the B+ movie wins, with 15,000 voters giving it 10-out-of-10.

 

Now you're out of statistics and into philosophy.  Should someone pay to see a big-budget movie that a lot of people love (though many hate it), or pay to see a low-budget movie that almost no one saw but whose small group of "devotees" loved it?

(shrug)


If they were different prices, such as $15 a ticket for big-budget movies and $5 a ticket for small-budget movies, then we'd have a different statistical scenario because we could talk "return on investment" or risk analysis.

But, since theaters charge the same ticket price (as far as I know) regardless of the budget of the movie, we get into the philosophical question of whether an audience member can get $12 in value out of a movie they didn't really like (just from the special effects or the spectacle of the crowd), versus whether they can get $12 in value from a small-budget movie that people who are into that sort of thing love, playing to a nearly empty theater, when who knows if the general public is into that sort of thing?

(shrug)(shrug)

 

At no point would I say we have definitive answers to any of these questions, so I'm against the general idea of putting a lot of effort into "proving" the unprovable with declarations of certainty about not only what the manipulation is, but also who is doing it, where they're doing it, and how they're doing it.  If we don't know whether unicorns exist in the first place, we certainly don't know whether blue-maned, eucalyptus-eating, Australian ranch-based unicorns are being manipulated by Disney Studios for fun and profit.  We might be able to say "probably", but we can't say "definitely". 

:popcorn:

I've actually provided plenty of indicators: differing review aggregator sites, CinemaScore results on first-week reactions, box office results, articles documenting how film studios have learned (and been caught at) combating Rotten Tomatoes when it is not in their favor, and even analysis of the trend points where the audience score shifted up or down. And let's not forget the most recently posted critic reviews, which now show they are working to pull such reviews when a stronger negative pattern emerges.

[Image: ROS_critics.jpg]

As for 'proving the unprovable', we are way past that now. There are plenty of definite indicators of what is causing Disney Marketing to take action. And it is taking action. The data flows with that.

:smile:


31 minutes ago, Bosco685 said:

I've actually provided plenty of indicators: differing review aggregator sites, CinemaScore results on first-week reactions, box office results, articles documenting how film studios have learned (and been caught at) combating Rotten Tomatoes when it is not in their favor, and even analysis of the trend points where the audience score shifted up or down. And let's not forget the most recently posted critic reviews, which now show they are working to pull such reviews when a stronger negative pattern emerges.

[Image: ROS_critics.jpg]

As for 'proving the unprovable', we are way past that now. There are plenty of definite indicators of what is causing Disney Marketing to take action. And it is taking action. The data flows with that.

:smile:

In that case, you're the author and I guess I'm on the "peer review" committee.  We appreciate your presentation; it will be taken under advisement. :foryou:


28 minutes ago, valiantman said:

In that case, you're the author and I guess I'm on the "peer review" committee.  We appreciate your presentation; it will be taken under advisement. :foryou:

The Peer Review is the box office results. Job is taken. But it needs an extra serving of 'more results'. Get your best waiter outfit on.

:nyah:


1 hour ago, Bosco685 said:

The Peer Review is the box office results. Job is taken. But it needs an extra serving of 'more results'. Get your best waiter outfit on.

:nyah:

My hourly rate is too high for movie reviews. If I need to do charity work, I'll be over at cgcdata.com instead.


As of Day 46 (2/3), ROS is now experiencing a larger drop-off than the last three modern Star Wars films:

- 63.8% below Force Awakens

- 23.2% below Rogue One

- 12.2% below The Last Jedi

[Image: SW_7.PNG]

Best-case, we see a re-release with deleted footage.

