Voltaire Foundation
Oxford University
10 episodes
6 months ago
Annual Voltaire Foundation Lecture on Digital Enlightenment Studies: Mikko Tolonen on Books as Objects, Data and Meaning: A Computational Approach to Eighteenth-Century Book and Intellectual History

In this lecture, I will present a framework developed by the Helsinki Computational History Group and implemented together with its partners for investigating eighteenth-century book and intellectual history through three interconnected lenses: Books as Objects, Books as Data and Books as Meaning.

We treat bibliographical detail as a key factor for understanding the flow of ideas, also examining the physical attributes of books, including the ornaments embedded within them. Using a dedicated machine learning pipeline, we automatically extract and categorize these ornaments at scale from the ECCO corpus, then integrate the results with bibliographical metadata. This approach enables us to trace publishing practices and how books circulated and transformed within the broader distribution of intellectual traditions, thus shedding new light on the activities of publishers and printers such as Jacob Tonson and John Watts.

We also approach Books as Data by leveraging computational methods for text reuse, translation mining and cross-lingual investigation for reception studies, focusing in particular on English, Scottish and French Enlightenment corpora drawn from ECCO and Gallica. These pipelines illuminate textual overlaps and uncover patterns of influence, offering insights into large-scale cultural and historical questions (for instance, the eighteenth-century reception of David Hume's essays) and providing the means to reevaluate fundamental issues such as the boundaries of translations.

Finally, we turn to Books as Meaning by applying cutting-edge large language models. Through "meaning matching", we not only quantify textual overlaps but also track how semantic content evolves over time, capturing cultural shifts once considered out of reach for computational study. By combining physical features of the books, computational workflows and interpretive practices, this threefold perspective (object, data, meaning) expands our capacity to analyze and reconstruct the multifaceted history of books, authors and ideas in the eighteenth century.
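The Books as Data strand above rests on detecting textual overlap between passages at corpus scale. As a minimal, illustrative sketch only (not the Helsinki Computational History Group's actual pipeline), the snippet below shows one common way such reuse detection can be bootstrapped: character-shingle sets compared by Jaccard similarity. The example passages and the threshold are invented for the illustration; the "meaning matching" described in the abstract would go further, replacing this surface-level overlap with semantic comparison of model embeddings.

```python
# Illustrative sketch of surface-level text-reuse detection via character
# shingling and Jaccard similarity. This is NOT the pipeline described in
# the lecture; the passages and the 0.5 threshold are invented examples.

def shingles(text: str, k: int = 5) -> set[str]:
    """Lower-cased character k-grams over whitespace-normalized text."""
    normalized = " ".join(text.lower().split())
    return {normalized[i:i + k] for i in range(len(normalized) - k + 1)}

def jaccard(a: set[str], b: set[str]) -> float:
    """Jaccard similarity of two shingle sets (0.0 if either is empty)."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical passages: an "original" and a lightly reworded reprint.
source = "Of the liberty of the press, so much has been said by others."
reprint = "Of the liberty of the press much has been said by other writers."

score = jaccard(shingles(source), shingles(reprint))
print(f"shingle overlap: {score:.2f}")

# A corpus-scale pipeline would compare candidate pairs (typically via
# minhashing to avoid quadratic comparison) and flag pairs above a threshold.
if score > 0.5:
    print("candidate reuse pair")
```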
Education
Digital Rhetoric, literae humaniores and Leibniz's dream
Voltaire Foundation
42 minutes
7 years ago
Willard McCarty (King's College London) gives the 2017 Besterman Lecture. If the digital computer is to be a 'machine for doing thinking' in the arts and letters, rather than merely a way of automating tasks we already know how to perform, then its constraints, and the powers those constraints define, need to be understood. This lecture explores those constraints and powers across the three stages of modelling a research problem: its translation into discrete, binary form; its manipulation by the machine; and its re-translation into scholarly terms.
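As a toy gloss only on the three stages named in the abstract (and not a representation of McCarty's argument), the sketch below makes the loop concrete with an invented sentence: the prose is translated into discrete tokens, the machine performs a purely formal manipulation, and the researcher re-translates the formal result into an interpretive claim.

```python
# Toy illustration of the three modelling stages mentioned above; the text
# is an invented example, not material from the lecture.
from collections import Counter

text = ("candide cultivates his garden and pangloss insists that "
        "all is for the best in the best of all possible worlds")

# 1. Translation into discrete form: reduce the prose to countable tokens.
tokens = text.split()

# 2. Manipulation by the machine: a purely formal operation on those tokens.
counts = Counter(tokens)

# 3. Re-translation into scholarly terms: the researcher reads the formal
#    result back as an interpretive (and contestable) claim about the text.
word, freq = counts.most_common(1)[0]
print(f"most frequent token: '{word}' ({freq}x); "
      "what that frequency means is not a question the machine can settle")
```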