
Everything posted by Superman2006

  1. I guess I'm color blind. All jokes aside, I think if this same book were to be graded now, it would likely receive a green label. From what I understand, if you have something that qualifies for a green label, you can ask CGC to give it a green label with a higher grade, or a blue label with a lower grade. In the case of this book, without seeing the back cover and interior, I don't think a blue label grade of 2.0 is unreasonable. Alternatively, if it were given a green label, I could see it getting a 4.0 (or higher).
  2. Ha ha, yeah, be sure to join the next one so you can retain your MVP title! I'll probably increase the minimum number of contests to qualify for the updated rankings through contest #7 to 3.25 contests. (Not everyone could participate in the 4th round of the 1st contest, so the maximum number of contests most participants could have completed through contest #7 will be 6.75, and 6.75 / 2 = 3.375, which rounds down to 3.25 contests at quarter-contest increments.)
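The quarter-contest rounding above (each round counts as 0.25 contests) can be sketched in a couple of lines; this is just my own illustration, not the poster's actual spreadsheet formula:

```python
import math

def floor_to_quarter(contests: float) -> float:
    """Round a contest count down to the nearest quarter contest
    (each round is 0.25 contests, so counts land on 0.25 steps)."""
    return math.floor(contests * 4) / 4

# Half of the 6.75 contests most participants could have completed:
minimum = floor_to_quarter(6.75 / 2)  # 3.375 rounds down to 3.25
```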
  3. Just a reminder that by only listing the top 50 ranked graders according to the requirements outlined in my posts, my intent isn't to exclude anyone from the rankings. I just don't want anyone near the bottom of the rankings to feel bad about their average scores, so I cut off the rankings pretty high up (i.e. just the top 50 out of the 99 participants that have competed in 2.75 contests or more out of the first 6 contests). For anyone else interested in their scores outside of the top 50 (whether you've completed 2.75 contests or not), don't hesitate to shoot me a PM, and I'll share your scores, ranking, etc. with you via PM.
  4. Here is my last planned ranking summary through contest #6 (I may post some other analysis at some point, such as most improved). This post just provides a consolidated view of the various rankings that I analyzed, sorted in order of the CGC ranking from my first post in this thread. The first two screenshots below show the same top 50 as in my first post. The 3rd screenshot below shows some additional contestants that are in the top 50 based on one of the two alternative "wisdom of the crowd" ranking systems that I described in my posts above. Congrats to @spidermanbeyond @Black_Adam @Rumler @jcjames @frozentundraguy @PeterPark @Ron C. @Stefan_W and @Gman65NJ for top 50 grader finishes based on the wisdom of the crowd ranking systems shown in the last 2 columns below (and described in more detail in my posts above)!
  5. The rankings based on collective averages in my posts above follow Aristotle's "wisdom of the crowd" concept, whereby "large groups of people are collectively smarter than individual experts..."
  6. While preparing the summary above, it hit me that maybe an even more accurate way to measure outstanding boardie grading performance (at least in my mind) might be to look at the average grades assigned by some number of top graders as measured by some preliminary criteria (I used the 50 award winners from my first post of this thread) in order to assign the "average top ranked boardies grade" for each book. I then recalculated all boardies' scores against the "average top ranked boardies grade" across all contests in order to come up with the following results: Using this approach, jbpez once again ranks first overall with an average score of just 12.5! TheGeneral and Electron follow fairly close behind with average scores of just 13.8 and 14.8 respectively! Although not shown in the summary above, just for kicks, I came up with a score for the CGC grader(s) across all books graded over the 6 contests, and the CGC grader(s) had a very impressive average of 14.0, which is just a bit behind the average scores for jbpez and TheGeneral when measured against the "average top ranked boardies grade".
  7. After each contest, CGC posts the grade distribution from all graders. zzutak then analyzes the CGC distribution and provides various grade distribution statistics (including the mean, median, and mode), and shares a bunch of other interesting commentary. Based on the grade distribution info from CGC and the grade distribution statistics from zzutak, there are some books where the average board grade is off by one or two increments from the CGC grade. Which is the more accurate grade in those cases: the grade assigned by 1 (or 2 or 3?) CGC graders, or the average grade assigned by a hundred or so boardies? Perhaps the CGC grade would approach the boardie average grade if CGC took that average grade from a higher number of graders. Due to economics, one can't really fault CGC for not doing so. Also, the more CGC graders physically handling a book, rather than simply grading from scans like we're doing, the greater the risk of handling damage to the book. With that said, I used my spreadsheet to calculate the mean grade for every book in the contests, and then rounded that grade to the nearest actual CGC grade (e.g. if the average boardie grade was 1.6, I rounded that to a grade of 1.5, since 1.6 is closer to 1.5 than to 1.8). I then measured everyone's performance against the average boardie grade and came up with the following results: Under this ranking approach, jbpez is the overall champ by a wide margin, with an average score of just 12.8!!! Congrats @jbpez !!!
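The "round to the nearest actual CGC grade" step described above can be sketched as follows. This is just my own illustration, not the poster's actual spreadsheet; the list is the standard 25-grade CGC scale as I understand it:

```python
# Standard CGC grading scale (note the uneven steps: 1.5 -> 1.8, 9.0 -> 9.2, etc.)
CGC_SCALE = [0.5, 1.0, 1.5, 1.8, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0,
             5.5, 6.0, 6.5, 7.0, 7.5, 8.0, 8.5, 9.0, 9.2, 9.4, 9.6, 9.8, 9.9, 10.0]

def snap_to_cgc_grade(mean_grade: float) -> float:
    """Round an average boardie grade to the closest grade CGC actually assigns."""
    return min(CGC_SCALE, key=lambda g: abs(g - mean_grade))

# An average boardie grade of 1.6 is closer to 1.5 than to 1.8:
snapped = snap_to_cgc_grade(1.6)  # -> 1.5
```

On an exact tie, `min` keeps the first (lower) candidate, i.e. it rounds ties downward; a spreadsheet could just as easily round ties up.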
  8. I think a better response from them is that they do not keep an actively customer-searchable register of what has been annotated on Signature Series books. This is quite close to something I asked about 15 years ago when I was curious about finding the serial numbers of books on the census to find which copy was graded first. CGC has and can pull up this information but chooses not to share it on the client side, as it does not hold enough value for them to implement. On the other hand, they might not be able to implement this type of search feature due to just how old the framework of their database is. As it has aged, it has become less and less intuitive functionally, and they do not seem interested in upgrading its infrastructure in any customer-beneficial way. As more and more pictures of books are added, can you imagine the following:
Census Database:
Search: X-Men #1 1963 9.8
Result: X-Men 1963 (parent directory)
X-Men #1 9.8 (2) (result)
Select the number in parentheses and you are directed to the following serial numbers:
0631963001
0000000000 (I don't have the # for the second one)
You can then click on the serial number and go right to the certification lookup with all the book's information. A nice clean searchable database would be great and something no other potential competitor currently has, though Beckett currently has feelers out (surveys) to select customers asking about more integrated (searchable) database options for all of their products. That's good stuff. CGC could even keep all of their existing search functionality, where you can search on X-Men 1, then click on the link for X-Men 1 from 1963, which then shows all of the census results for X-Men 1 from 1963. CGC would then just need to do everything else you described, i.e. allow the user to click on the census count for a given grading category and grade (such as a count of 2 for Universal / 9.8), which would then pop up another screen showing the serial numbers for the 2 Universal 9.8 copies, and then allow the user to click on a serial number to pull up all of the available certification info for that book. The OP could then check out the other 41 CGC 5.0 SS X-Men #1 books to see if any others were noted as being signed & Excelsior by Stan Lee.
  9. If someone has experience successfully copying and pasting from Excel into a boards message, please let me know; the formatting gets all messed up on me when I try to do it (hence the screen shots that I've been including in my earlier posts). Thanks : )
  10. One tool that can be used to improve one's CGC grading contest results is to analyze your scores to see if you consistently undergrade or overgrade compared to CGC. zzutak has mentioned this concept in some of his past posts: "So, just for kicks, recalculate your total contest score using "signed deltas": -2 for your guess being 2 grade increments too low, -1 for your guess being 1 grade increment too low, 0 for a bullseye, +1 for your guess being 1 grade increment too high, +2 for your guess being 2 grade increments too high, and so on. What's your cumulative 20-book tally now?" In other words, if you graded a book an 8, and CGC graded it a 9, you would have a signed delta score of -2 for that book. Now if you graded another book a 4.0, and CGC graded it a 3.0, then you would have a signed delta score of +2 for that book. If you summed up your results across those 2 books, you would have a total signed delta score of -2 + +2 = 0, meaning that on average, you didn't undergrade or overgrade books relative to CGC. Here are the signed delta results for the award winners listed in my first post: Amongst the top 10 ranked participants: EastEnd1 has an average signed delta score of -0.2, meaning on average he just slightly undergrades compared to CGC. jbpez has an average signed delta score of -7.2, meaning he tends to undergrade compared to CGC. This means that you'll want to buy raw books from jbpez, lol. This also means that it is possible that jbpez could improve his already impressive #3 ranking if he loosened up on his grades a tad in future CGC grading contests. Get Marwood & I has an average signed delta score of 4.0, meaning that he tends to overgrade a bit compared to CGC during grading contests.
Keep in mind that this is an average signed delta score per 20-book contest, so it's really not that high; it translates to just a 0.2 average signed delta score per book, and with most grading increments equal to 0.5 (or less when you get into grades in the 9's), this has very little impact on the average grade assigned. So I'd caution anyone against overcompensating when assigning grades in future grading contests if you have a relatively small average signed delta score. There are average signed delta scores close to zero amongst those in the top 10% and in the bottom 10% of average scores, so an average signed delta score close to zero doesn't necessarily translate to being an accurate overall grader, but if your average signed delta score is a big positive or a big negative number, it might be worth adjusting your future grades a bit. I'll probably post everyone's signed delta score at some point (sorted alphabetically, without overall score) in case anyone else wants to see how much they tend to overgrade or undergrade on average.
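The signed delta tally described above can be sketched in code. This is my own illustration, not zzutak's actual method: deltas are counted in grade increments along the CGC scale (so 8.0 vs. 9.0 is 2 increments, not 1.0 grade points):

```python
# CGC grading scale; a "grade increment" is one step along this list.
CGC_SCALE = [0.5, 1.0, 1.5, 1.8, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0,
             5.5, 6.0, 6.5, 7.0, 7.5, 8.0, 8.5, 9.0, 9.2, 9.4, 9.6, 9.8, 9.9, 10.0]

def signed_delta(your_grade: float, cgc_grade: float) -> int:
    """Negative = you undergraded, positive = you overgraded, 0 = bullseye."""
    return CGC_SCALE.index(your_grade) - CGC_SCALE.index(cgc_grade)

# The two example books from the post:
book1 = signed_delta(8.0, 9.0)  # -2 (two increments too low)
book2 = signed_delta(4.0, 3.0)  # +2 (two increments too high)
total = book1 + book2           # 0: no net under- or overgrading
```

Summing this over all 20 books of a contest gives the cumulative tally zzutak describes.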
  11. @TheGeneral I fixed it in my spreadsheet (your average rounded to one decimal place is still 17.8), so it should be good to go for any further analysis / future contest updates.
  12. Thanks General! I took a look back and you are correct; you participated in contests 2 through 6. For contest #3 the results showed you as "The General" rather than "TheGeneral", and when I sorted by name before finalizing my spreadsheet, there were a couple of other board names between those two names, so I didn't catch that (I did catch 6 other such variances between contests; hopefully I was 6 for 7 overall in catching such typos). I will fix that in my spreadsheet, but it won't really impact your ranking or average score, as you averaged 17.8 in the 4 contests where you were listed as "TheGeneral" vs. 18.0 in the 1 contest (contest #3) where you were listed as "The General".
  13. Hey DrMitchJ, See this thread for CGC Mike's explanation of contest #6 that just wrapped up a week or so ago: As Mike explains in the thread above, the top 3 winners of each contest win prize money from CGC. This thread that you're reading now just provides some additional stats for those that have partaken in multiple contests to date. I believe CGC Mike starts threads in the Comics General forum and in this "Hey buddy, can you spare a grade?" forum a few weeks or so before the start of each contest, so keep an eye out for Mike's advance notice thread(s), and join in the fun in the next contest!
  14. at least you made "a list" haha I'll see myself out You're in some lists, just not the ones published in my first post. You are in the group of 99 participants that qualified for the lists in my first post above by completing 2.75+ contests. In your case, as you mentioned, you have participated in all the contests. In fact, you are one of just 14 graders that has submitted grades for every round so far (ignoring the 4th round of the first contest for those that didn't qualify for that round). I missed the signup deadline for the first contest by a day or two, or I'd be in that club too.
Perfect Attendance Group of 14: jbpez, Motor City Rob, WilliamLunt, Point Five, zzutak, grendelbo, pastandpresentcomics, silverseeker, Axelrod, davidtere, comicginger1789, jas1vans, musicmeta, ADAMANTIUM
  15. Nice effort! Food for thought (for anyone reading this thread): Which performance is "better" -- an average score of 18 in Contests #3 and 4 (where the top scores were 18 and 17), or an average score of 16 in Contests #1, 2, 5, and 6 (where the top scores were 11, 15, 11, and 13)? Thanks, zzutak. In my opinion the answer to your question is somewhat subjective, as there are many factors in play. Here is how I would view it: Following are the overall average scores per round for contests 1 through 6: 34, 31, 30, 32, 29, 29. I believe Mike advertised contest 1 outside of the boards, if I recall correctly, so it isn't too surprising that the average score for contest 1 was a bit higher than the average for contests 2 through 6. The average scores for contests 2 through 6 are all 30 +/- 2, which to me indicates that any given contest isn't really that much "easier" or "harder" than another. As such, I would argue that the lower average score in any given contest is the "better" performance (although if we're talking scores of 16 and 18, the performances are nearly the same).
  16. In zzutak's ranking system you just got 3 points for an 8th to 12th finish, but you regularly finished close to the top 10, so under my ranking system you do much better than someone else that might have finished say 6th in one contest and collected 5 points, but finished much lower in the other contests. You are on the cusp of joining the illustrious Diamond Club ranking.
  17. I totally agree. Surprisingly (to me), I'm currently there, but I really don't expect to stay there, as a score of 20 or better leaves very little room for error. Over 2.75+ contests, I would argue that an average of 25 or better is pretty remarkable, and 30 or better very respectable.
  18. @Point Five @JollyComics @zzutak @Frederic9494 @MBFan @grendelbo @Morganmi @pastandpresentcomics @mytastebud @Yorick @JohnH19 @Warlord @batcollector @flashlites @PINWHEEL @Mars76 @silverseeker @JWKyle @CJ Design @thehumantorch @Axelrod @universal soldier @Hudson @Kramerica @davidtere @JustJimN @sledgehammer @Old Fashion PB and J @AJD @Mr. Zipper @KryptoMayor @stynxno @Cerebus3000 @scburdet @comicginger1789 @Hibou @toro @skypinkblu @Wipple @mysterymachine Special thanks to @CGC Mike for the great contests. Great job working with all the data too; I didn't have to make very many data tweaks in order to pull this all together (most tweaks were pretty minor like the occasional typo on boardie names in the results summaries, that I could easily identify and fix). If anyone else not included in the top 50 rankings above would like their score, just post in this thread, and I'll reply in the thread with your score and contests completed (or PM me and I'll PM your score and contests completed to you).
  19. MVP: @Squirrel Guy
Diamond Club: @TheGeneral @jbpez @Superman2006 @Withering Wind @Electron @Motor City Rob @Get Marwood & I @WilliamLunt @EastEnd1
  20. There are different ways that one could slice and dice the grading contest results over the last 6 grading contests to come up with cumulative rankings. The approach I took was to rank all participants based on their average score over 4 rounds. If all 6 contests to date had allowed everyone to participate in all rounds, I would have set the minimum participation level equal to 3 contests worth of participation; however, given the limitations of the 1st contest, I decided to set the minimum participation level equal to 2.75 contests in order to qualify for the various award categories below. Without further ado, following are the "MVP", "Diamond Club", and "Honor Roll" award winners based on performance through the first 6 CGC grading contests: You'll notice that in the rankings above, I also show the ranking based on an alternative ranking approach shared by zzutak in another thread, where zzutak applied the following approach that only gives weight to top 10 finishers: "... I decided to compare the Top-10 finishers in each of the first six CGC Quarterly Grading Contests. I awarded 10 pts for a 1st Place, 9 pts for a 2nd Place, and so on (down to 1 pt for a 10th Place). When there were multiple-player ties, I awarded each player in the tied group the number of points corresponding to the best place occupied. For example: a three-player tie for 4th-6th Place would earn each player 7 points; a four-player tie for 6th-9th Place would earn each player 5 points; a five-player tie for 8th-12th Place would earn each player 3 points...". The results from zzutak's points ranking system are generally in line with the more inclusive (and data intensive) ranking system that I applied (especially for the top 10 ranked finishers, as one would expect). Congrats to Squirrel Guy and all other award winners! A few additional thoughts based on contest results through the first 6 contests:
1. 391 participants have submitted grades for 1 or more rounds.
2. 99 participants have submitted grades for 2.75+ contests, and thus qualified for the awards above. The average score for those 99 participants ranges from 16.7 points per contest (i.e. per 4 rounds) for Squirrel Guy to 46.7 points per contest for the 99th ranked qualifying participant.
3. To summarize the average scores at or around the bounds of each award (and non-award) category: for the 1st, 11th, 51st, and 99th qualifying participants, the average scores were 16.7, 22.3, 29.4, and 46.7.
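zzutak's tie-aware points rule quoted above can be sketched as follows. This is my own illustration of the rule, not zzutak's actual code; contestants are assumed already sorted best first (lowest score first, since lower is better in these contests):

```python
def award_points(scores: list[int]) -> list[int]:
    """Award 10 pts for 1st place down to 1 pt for 10th; every player in a
    tied group gets the points for the best place the group occupies."""
    points = []
    place = 1  # best place occupied by the current tie group
    i = 0
    while i < len(scores):
        j = i
        while j < len(scores) and scores[j] == scores[i]:
            j += 1  # extend the tie group
        pts = max(0, 11 - place)  # places beyond 10th earn nothing
        points.extend([pts] * (j - i))
        place += j - i  # next group starts after the whole tied group
        i = j
    return points

# A three-player tie for 4th-6th place earns each player 7 points:
pts = award_points([10, 12, 14, 15, 15, 15, 20])
# -> [10, 9, 8, 7, 7, 7, 4]: the player after the tie drops to 7th place (4 pts)
```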
  21. I think it's pretty safe to say that any player with a total score of 20 or less in any single CGC contest is a good grader. Although with "just" 20 books, I think it is possible that the stars just aligned for them on a single contest, but if they repeat that performance (or don't deviate too far from it) over multiple contests, then I think it's pretty safe to say they're an outstanding grader. Regarding your 2nd point, a low total signed delta score on its own doesn't necessarily mean that someone is a good grader, as even a terrible grader could end up with a total signed delta score that is close to zero (I'm not suggesting that you implied otherwise). I agree that the total signed delta calculation is a good exercise for anyone to perform to see if they are consistently over- or under-grading compared to CGC, and something that could help them improve their grading skills in the future. e.g. A not-so-great grader with a total score of 56 could still end up with a total signed delta score of zero if they threw darts to come up with their grades, similar to the following:
1 bullseye = 0
0 times a guess was 1 grade increment too low = 0
3 times a guess was 1 grade increment too high = +3
3 times a guess was 2 grade increments too low = -6
2 times a guess was 2 grade increments too high = +4
3 times a guess was 3 grade increments too low = -9
1 time a guess was 3 grade increments too high = +3
2 times a guess was 4 grade increments too low = -8
3 times a guess was 4 grade increments too high = +12
1 time a guess was 5 grade increments too low = -5
0 times a guess was 5 grade increments too high = 0
0 times a guess was 6 grade increments too low = 0
1 time a guess was 6 grade increments too high = +6
Total signed delta score = 0
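A consistent dart-thrower tally like the one above (20 books, total score 56, signed total 0) can be checked with a few lines. This is my own sketch, and it assumes a book's contest score is simply the unsigned number of increments off:

```python
# (count, signed delta in grade increments) pairs for one 20-book contest:
# 1 bullseye, 3 guesses one increment high, 3 guesses two increments low, etc.
tally = [(1, 0), (0, -1), (3, +1), (3, -2), (2, +2), (3, -3), (1, +3),
         (2, -4), (3, +4), (1, -5), (0, +5), (0, -6), (1, +6)]

books = sum(count for count, _ in tally)                    # 20 books graded
signed_total = sum(count * d for count, d in tally)         # 0: no net bias
unsigned_total = sum(count * abs(d) for count, d in tally)  # 56: a poor score
```

The signed total cancels to zero even though the unsigned total shows a weak grading performance, which is exactly the point of the example.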
  22. I just saw that zzutak posted another type of ranking across multiple contests on Monday; somehow I missed that post until now. It looks like we both had the same idea in mind for my item (b) above, albeit with slightly different approaches to coming up with a ranking.
  23. I was curious to see (a) how my grading contest scores have changed from contest to contest and (b) how my average grading contest skills stack up against others'. I mocked up the following summary template that could be used to meet both of those objectives. Since not all boardies will have competed in all 6 contests to date, the "Total Contests" column shows how many contests each boardie competed in, to give some sense as to the credibility of a boardie's average score. The "Contest Average" is calculated as the total of the "Scores by Contest" / the total of the "Rounds Completed by Contest" * the rounds per contest, e.g. in my case 96 / 20 * 4 = 19.20. If for whatever reason I had only taken part in the first 2 rounds of contest #6, and my score for those 2 rounds was 10, then my total would have been 89 (instead of 96), and my rounds completed would have been 18 (instead of 20), giving me a "Contest Average" of 89 / 18 * 4 = 19.78. In other words, the approach I have proposed for calculating "Contest Average" works even if someone didn't complete all of the rounds in a given contest. I figured I'd check to see if anyone else was interested in seeing this type of grading score summary / ranking before trying to pull the data together into Excel and populating the summary below any further. If someone was interested in becoming a grader for CGC and they had a good grading contest average score and a decent number of total contests (or improved performance over time), that might not be a bad thing to mention in their cover letter / resume.
Boardie        Contest Average   Total Contests   Scores by Contest (#1-#6, Total)   Rounds Completed by Contest (#1-#6, Total)
Superman2006   19.20             5.0              0, 18, 20, 21, 20, 17 = 96         0, 4, 4, 4, 4, 4 = 20
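The "Contest Average" calculation above can be sketched as follows; this is my own illustration of the formula, not the poster's Excel workbook:

```python
ROUNDS_PER_CONTEST = 4

def contest_average(scores_by_contest: list[int], rounds_by_contest: list[int]) -> float:
    """Average score per full contest, pro-rated for partially completed contests."""
    return sum(scores_by_contest) / sum(rounds_by_contest) * ROUNDS_PER_CONTEST

# Superman2006's row from the summary template: 96 points over 20 rounds
avg = contest_average([0, 18, 20, 21, 20, 17], [0, 4, 4, 4, 4, 4])  # 96/20*4 = 19.20

# The partial-participation example from the post: 89 points over 18 rounds
partial = contest_average([0, 18, 20, 21, 20, 10], [0, 4, 4, 4, 4, 2])  # ~19.78
```

Because the score total and the round total are divided before scaling back up to 4 rounds, a half-finished contest contributes proportionally instead of dragging the average down.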