Biblioblog Top 50 (October, 2010)

This post is a bit late, but among the top 50 biblioblogs for October, 2010, the top 10 student biblioblogs are:

| Student | Overall | Author(s) | Blog | Alexa Score |
|---------|---------|-----------|------|-------------|
| 1 | 2 | Joel L. Watts | Unsettled Christianity | 95521 |
| 2 | 8 | Scott Bailey | Scotteriology | 212042 |
| 3 | 12 | Jeremy Thompson | Free Old Testament Audio Website Blog | 294803 |
| 4 | 15 | Jonathan Robinson | Xenos | 300343 |
| 5 | 18 | Brian LePort, JohnDave Medina, and Robert Jimenez | Near Emmaus: Christ and Text | 382933 |
| 6 | 21 | Mark Stevens | Scripture, Ministry, and the People of God | 420079 |
| 7 | 22 | Phillip Long | Reading Acts | 431256 |
| 8 | 25 | S. Demmler | You Can’t Mean That! | 503362 |
| 9 | 26 | Gavin Rumney | Otagosh | 503927 |
| 10 | 29 | Bacho Bordjadze | Reading Isaiah | 533766 |

As always, updates and corrections are welcome, particularly for those who may have recently matriculated or graduated.



4 responses to “Biblioblog Top 50 (October, 2010)”

  1. Kirk Lowery

    I’m curious: what criteria are used to determine the top 10 and top 50 blogs? Number of visitors? So, popularity? Or something else?

    1. David Stark

      The overall top 50 list that Jeremy maintains and produces is, as I understand it, based purely on the (mostly traffic-dependent) site rankings that Alexa.com provides for his list of bibliobloggers. I then work through this list and identify the subset of the top ten student bibliobloggers, as measured by that same instrument. Thus far, there have always been at least ten student bibliobloggers in the list. If that changes in a given month, though, I might have to go with the top nine or something. 🙂

  2. Kirk Lowery

    Thanks for the response. That is what I thought.

    Why is popularity the primary criterion for importance? Should it be THE criterion? In other words, this list is not very useful for me. It might highlight a good blog, but good blogs are not always popular. The biblioblogs I follow (including yours!) are not on this list, nor on the list of 50.

    The whole idea of blog rating needs to be rethought from the ground up, IMO.

    1. David Stark

      You are, of course, quite right. Judging “good books” simply to be those that appear on the New York Times Best-Seller List would be similarly problematic. That’s one reason I don’t put too much weight on the qualitative accuracy of these lists. There are a number of blogs ranked lower on the list, or absent from it entirely (like yours), to which I pay more attention and which I find more helpful than those with higher Alexa rankings. So, qualitatively speaking, the monthly lists have mainly amusement value at present, for me at least. Of course, if a different metric or combination of metrics could produce a more qualitatively accurate ranking, that would be very much preferable. There was some discussion about this issue a while back, but I don’t remember seeing, nor have I since conceived of, another workable metric or combination of metrics. As I write this reply, though, the whole issue strikes me as something that could benefit from an empirical-humanistic perspective, which might bring some more objective criteria to an otherwise subjective, qualitative task. 🙂 Any thoughts?
