Saturday, October 10, 2015

A question about diamond readings in NA28

There are obviously many great things about NA28, but there are also some ongoing niggles. One thing I've come across recently is the situation with some of the "diamond" readings in the Catholic Epistles. These "diamond" readings are those where the editors could not decide between two readings, and in ECM2 the primary text line has both (although other readings may be involved in some of these variation units, the indecision in each case concerns two readings). For these readings the editors are saying, as I understand it, that neither the normal methods nor the CBGM has been able to resolve which reading represents the initial text, and so the decision is left open.

So this is all fair enough within their parameters. The way the initial text is decided, and the way in which the CBGM is the product of these decisions at local stemma level, means that there are only two options: editorial unanimity or editorial impasse. There is no mechanism (as in the previous "Nestle" model) for a majority vote at points of uncertainty. In the ECM approach we are left with editorial impasse.

The NA28 approach is to leave the old text in place (the text of NA26/27, in many cases reaching back to N1), mark it with a diamond (indicating that the editors leave the decision open), and use the same diamond marker in the apparatus to mark the second (equally viable) reading.

But here is the thing I can't understand. On a number of occasions the apparatus does not provide the manuscript evidence for the txt reading. I would have thought that in situations of such uncertainty, when you are effectively leaving the reader to make his or her own judgement, you would always provide (in summary) the evidence for both (equally viable) readings. But that doesn't happen. In this morning's reading I came across four examples where there was no indication of the evidence in favour of the txt reading: 1 John 2.4 (whether or not to read OTI), 2.6 (whether or not to read OUTWS), 2.17 (whether or not to read AUTOU), 2.29 (whether to read EIDHTE or IDHTE). You can get a bit more info from NA27 (but only on 1 John 2.6).

There may be an explanation for this, but I have struggled to think of a good one. I would suggest that in the next revision the evidence for both options be provided in the apparatus for all diamond readings.

Friday, October 09, 2015

Pioneers of the Trade: Famous Text-Critical Scholars (CSNTM)

The Center for the Study of New Testament Manuscripts (CSNTM) offers a number of interviews in a series called "Pioneers of the Trade: Famous Text-Critical Scholars", which is freely available on iTunes U.

A few days ago, a new interview appeared with Dr. Ekaterini Tsalampouni, lecturer in NT at the Aristotle University of Thessaloniki, in which she describes her research on New Testament inscriptions, the Byzantine text, and how the digitization of New Testament manuscripts is revolutionizing the field of New Testament textual criticism.

I was interviewed at the recent SBL in San Diego for this series, so that might appear, too, at some point in the future.

Thursday, October 08, 2015

Destroying a Manuscript to Test Multispectral Imaging

Image setup for the experiment.
There is a fascinating new article out today in the journal Digital Scholarship in the Humanities (more on which later) on multispectral imaging (MSI). What makes this article so interesting is that the authors destroyed a manuscript in order to systematically put MSI through its paces.

Here is how the authors sum up their aim (my emphasis):
Previous multispectral imaging capture projects, applied to specific examples of texts of historical importance, have concentrated on recording documents in their current state (generally once important features are illegible). Here, we investigate best practice in the multispectral imaging of heritage material by imaging a parchment document before and after a series of degradation processes, allowing us to assess the effectiveness of image processing algorithms to recover information from degraded documents. This gives us a unique platform for evaluating the quality of recovered images, and allows us to assess the performance of image processing algorithms for analysis of these images.
In other words, the authors applied MSI to an undamaged manuscript, then systematically damaged it in various ways and re-applied MSI, so that they would have a more objective measure of best practices. The unlucky specimen is described thus:
The document was an assignment of property which had been deemed to hold no historical or scholarly value, and had been de-accessioned from their collection prior to our request for parchment material in accordance with The National Archives guidance on deaccessioning and disposal.
The unlucky specimen, here showing the samples.
From this manuscript the authors cut 23 specimens, all with text, generally avoiding folds and blemishes. Various “degradations” were then applied to 20 of these, simulating problems that arise in the manufacture, use, and storage of parchment. These included anything from scraping the ink off to oil stains, chemical reagents, fire, water, smoke, and, yes, human blood (full list here). The specimens were photographed under the same conditions both before and after the damage was done. In total, they took 2,800 images!

The blood sample before damage (left), after (center), and the best result from MSI (right).
Naturally, they found that MSI had different levels of success depending on the type of damage, though it generally fared well:
In some samples in which the writing has been rendered unreadable by the treatment, the writing can be recovered, including aniline dye, oil, and blood. In some samples the writing is completely obscured or the parchment has been severely affected and recovery is all but impossible, including iron gall ink, India ink, and mould. In most cases, however, the image processing algorithms can extract more information from the multispectral images of treated samples corresponding to the writing than the human eye can see.
The real value of the study, of course, lies in the ability to measure the relative success and to isolate the specific circumstances in each case. They don’t give all the results in the paper (they do elsewhere), and they are clear that this is not a systematic study of any one type of damage (although they hope to do that in later studies).
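Out of curiosity about what an “image processing algorithm” actually does with data like this, here is a minimal sketch of one common approach to multispectral captures: stack the per-wavelength images of a sample and run a principal component analysis, so that faint ink/parchment contrast spread thinly across many bands is concentrated into a few component images. To be clear, this is my own illustration and not the authors’ pipeline; the function and file handling are hypothetical, and it simply assumes each band was saved as a separate greyscale image.

# A minimal, hypothetical sketch (not the authors' actual pipeline) of PCA-based
# enhancement of multispectral captures of a single sample.
import numpy as np
from sklearn.decomposition import PCA
from imageio.v3 import imread  # assumes each band is a separate greyscale image file

def enhance_sample(band_paths, n_components=3):
    """Return PCA component images for one imaged sample (paths are hypothetical)."""
    bands = [imread(p).astype(np.float64) for p in band_paths]   # one 2-D image per wavelength
    height, width = bands[0].shape
    stack = np.stack([b.ravel() for b in bands], axis=1)         # pixels x bands
    components = PCA(n_components=n_components).fit_transform(stack)
    return components.reshape(height, width, n_components)       # inspect each slice for recovered writing

The point is simply that information too faint for the eye in any single band can become visible once the bands are combined, which is exactly what the before-and-after design of this study sets out to quantify.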

This method raises questions about when to destroy the past in order to better understand it, but I’ll leave those issues for you all to discuss. For my part, I would have liked to know a little bit more about how this document was “deemed to hold no historical or scholarly value.” Apparently there exist guidelines for just such a decision.

I’ve never been involved in the direct process of applying MSI, but I imagine this study would be very useful to those weighing the costs and benefits of using MSI.

You can read the whole article online (free).

Tuesday, October 06, 2015

Recent Journal Articles on Textual Criticism

While looking at the new journals shelf last week I noticed a number of text critical articles have come out recently.

Novum Testamentum  57.5

This article investigates the textual history of the explicit quotations of Isaiah in the Acts of the Apostles of Codex Bezae Cantabrigiensis (Acts 7:49–50; 13:34; 13:47) by introducing the concept of “Old Testament awareness.” This concept can be defined as the degree to which a NT tradition, at any stage of its transmission history, is aware of a quotation stemming from the OT. OT awareness can be identified in the layout of Codex Bezae (e.g., the indentation of text in the manuscript to indicate OT quotations), the text of quotations (e.g., readings that can be shown to be a subsequent change towards an OT tradition) and the context of the quoted text (e.g., the quotations’ introductory formulae). Through assessing the OT awareness of Codex Bezae’s explicit quotations of Isaiah, different stages in the transmission history of the text of these quotations in Codex Bezae’s Acts can be identified.
Laurent Pinchard, Des traces vétérotestamentaires dans deux variantes du Codex de Bèze (Mt 26,55 et 28,8) jugées harmonisantes, pp. 418–430
Codex Bezae is traditionally famous for its harmonising tendency compared to other early majuscule manuscripts of the Gospels. In this article we suggest that, based on two examples drawn from Matthew, some of its variant readings have striking lexical correspondence with passages from the Old Testament. As a result, it is more likely that they probably transmit an original reading as opposed to being the result of a less capable scribe, who would have corrected an earlier text to make it closer to the parallel passages from the Synoptics. The passages examined are Jesus’ arrest on the Mount of Olives (Mt 26.55) and the women’s encounter at the tomb on Easter day (Mt 28.8).
Also in NovT, Simon Crisp and J. K. Elliott review vols. 1–2 of the New Cambridge History of the Bible  and Hugh Houghton reviews Die Vetus Latina-Fragmente aus dem Kloster St. Gallen.

New Testament Studies 61.4

Joel D. Estes, Reading for the Spirit of the Text: nomina sacra and πνεῦμα Language in P46, pp. 566–594
This study examines every reference to πνεῦμα in NT Papyrus 46 (P. Chester Beatty ii / P. Mich. Inv. 6238) and whether or not it is contracted as a nomen sacrum. Against expectations, the scribe does not always use nomina sacra to designate the divine Spirit, nor are other kinds of spirits always written out in full. This discovery destabilises the assumption that we can access the scribe’s understanding of πνεῦμα simply by identifying where nomina sacra do and do not occur. At the same time, such scribal irregularity itself may illustrate wider theological ambiguities among some early Christian communities concerning the status and role of the Holy Spirit.
Peter Malik, The Corrections of Codex Sinaiticus and the Textual Transmission of Revelation: Josef Schmid Revisited, pp. 595–614
The role of manuscript corrections in studying textual transmission of the New Testament has been long recognised by textual critics. And yet, the actual witness of corrections may at times be difficult to interpret. A case in point is Josef Schmid’s seminal work on the text of Revelation. Following Wilhelm Bousset, Schmid argued that a particular group of corrections in Codex Sinaiticus reflected a Vorlage with a text akin to that of the Andreas text-type. By dating these corrections – unlike Bousset – to the scriptorium, Schmid utilised their witness to trace the text of Andreas back to the fourth century. Recently, Juan Hernández has shown that the corrections cited by Schmid were significantly later, hence calling his fourth-century dating of Andreas (among other things) into question. Through an analysis of the corrections cited by Schmid, supplemented by a fuller data-set of Sinaiticus’ corrections in Revelation, this study seeks to reappraise Schmid’s claims concerning the textual relations of these corrections, and identify their role in the later transmission of the text of Revelation.

Tyndale Bulletin 66.1

Lincoln Blumell, A New LXX Fragment Containing Job 7:3–4 and 7:9, pp. 95–101

This article presents an edition of a papyrus fragment from LXX Job that is housed in the Hatcher Graduate Library at the University of Michigan. The fragment likely dates to the sixth century A.D. and comes from a codex. On the recto the fragment contains Job 7:3–4 and on the verso Job 7:9. [Includes two black and white photos.]

Tuesday, September 29, 2015

Duplicating a Passage: Scribes getting in the way of the Apparatus

Minuscule 1573 writes the section Mt 25:22-23 twice. Not good.

This is the relevant section (images available here and here at the NT.VMR):

I don’t know who is responsible for crossing out the duplicated words; it may be the scribe, or it may be a later reader or corrector.

What makes this duplication interesting is that the first time round, 1573 reads προσελθων και ο τα δυο ταλαντα (without δε, and only with Sinaiticus and Vaticanus), but the second time προσελθων δε και κτλ. with the rest of the Greek tradition. Legg cites 1573 as siding with the minority reading, but this is only correct when we look at the first occurrence, since the second, crossed-out version has the ‘normal’ text.

I am not sure how this could be put in an apparatus. Perhaps something like 1573primus and 1573secundus? Or should we just label the duplicated bit 1573dupl.? I assume this problem has been addressed in textual criticism somewhere, but I wouldn’t know where.
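Purely by way of illustration (using the sigla floated above and the attestation as described in this post, not the actual formatting of ECM or NA), the variation unit might then be reported along these lines:

προσελθων και א B 1573primus | προσελθων δε και 1573secundus (del.) rell

That would at least keep both occurrences on record, while signalling that the second was subsequently struck through.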

Monday, September 28, 2015

Ad Fontes, Ad Futura: Erasmus’ Bible and the Impact of Scripture

The weather probably isn’t too bad in February.
Another conference ETC readers might be interested in.

February 25-27, 2016
Houston Baptist University

In celebration of the upcoming 500th anniversary of Erasmus’ Greek text and the Reformation, the Department of Theology at HBU, in conjunction with the Dunham Bible Museum, is pleased to host the conference Ad Fontes, Ad Futura: Erasmus’ Bible and the Impact of Scripture. The conference will consider the textual and historical issues surrounding the development of the Bible, the Bible’s impact on human society across the centuries, and the future of Biblical translation and interpretation. Our keynote speakers include Craig Evans (Houston Baptist University), Timothy George (Beeson Divinity School, Samford University), Herman Selderhuis (Theological University Apeldoorn) and Daniel Wallace (Dallas Theological Seminary). The plenary talks are free and open to the public.

We also invite proposals for short papers from scholars and graduate students from a wide array of disciplines and topics, including:
  • The historical context, and textual tradition, of the Biblical canon;
  • The history of the Greek text of the Bible;
  • The social and/or cultural impact of the Bible in any historical period or location;
  • The Bible and the history of the book;
  • Modern Bible translations and translation practice;
  • Textual and cultural issues concerning the Bible in the Digital Age.
Anyone who is interested should submit a 300-word abstract on any relevant topic. Papers should be 20 minutes long, and decisions will be announced before January 8, 2016. Send proposals to Jason Maston. [I can’t find a submission deadline in the announcement itself, but submissions are due by Dec. 9.]

Wednesday, September 23, 2015

Call for Papers: Digital Editions: Academia, Society, Cultural Heritage

The Cologne Center for eHumanities is organizing the second DiXiT convention, taking place 16-18 March 2016 in Cologne, Germany. The conference will be preceded by a day dedicated to workshops on:
  • Publishing Models for Digital Scholarly Editions
  • Aggregation of Digital Cultural Content and Metadata Mapping
  • XML-Free Scholarly Editing
The convention organizers invite contributions from everyone working in the field of scholarly editing and its neighbouring areas. Early career scholars are welcome.
While the convention is open to any research about digital scholarly editing, the focus will be on its relation to academia, society and cultural heritage. As such, topics for the sessions may especially include:
  • textual criticism and the future of the high standard critical edition
  • open/public knowledge: mutual benefit for academia & society
  • social editing, crowdsourcing, citizen science
  • issues of rights and ethics related to scholarly editions
  • scholarly curation and usage of cultural heritage data
  • museums, libraries & archives as data providers for the edition
  • dissemination, sustainability and addressability of digital heritage assets
  • publishing the edition and the role of publishers
  • editors and the job market: career prospects
  • and others
The deadline for submission is 16 October 2015. More info on submissions here.