MIT Comparative Media Studies/Writing
Massachusetts Institute of Technology
406 episodes
3 days ago
How and why, in the latter half of the twentieth century, did informatic theories of “code” developed around cybernetics and information theory take root in research settings as varied as Palo Alto family therapy, Parisian semiotics, and new-fangled cultural theories ascendant at US liberal arts colleges? Drawing on his recently published book “Code: From Information Theory to French Theory” and primary sources from the MIT archives, this talk explores how far-flung technocratic exercises in Asian colonies and MIT’s Research Laboratory of Electronics (RLE) inspired these diverse audiences in a common dream of “learning to code.” The result is a new history of the ambitions behind the rise of “theory” in the US humanities, and of that endeavor’s obscure ties to Progressive Era technocracy, US foundations, and the growing prestige of technology and engineering in twentieth-century life.

Bernard Dionysius Geoghegan is a Reader in the History and Theory of Digital Media at King’s College London. An overarching theme of his research is how “cultural” and “humanistic” sciences shape—and are shaped by—digital media. His attention to cultural factors in technical systems also figured in his work as a curator, notably for the Anthropocene and Technosphere projects at the Haus der Kulturen der Welt. Duke University Press recently published his book Code: From Information Theory to French Theory (2023), based partly on archival research he undertook as a visiting PhD student at MIT around 2008.
Education
Eric Freedman, "Non-Binary Binaries and Unreal MetaHumans"
MIT Comparative Media Studies/Writing
1 hour 19 minutes 44 seconds
3 years ago
Eric Freedman, "Non-Binary Binaries and Unreal MetaHumans"
Video game engines have promoted a new cultural economy for software production and have provided a common architecture for digital content creation across what were once distinct media verticals—film, television, video games, and other immersive and interactive media forms that can leverage real-time 3D visualization. Game engines are the building blocks for efficient real-time visualization, and they signal quite forcefully the colonizing influence of programming. Video game engines are powering our visual futures, and engine developers such as Unity Technologies and Epic Games are rapidly iterating their products to tackle new markets where data and visuality continue to converge.

This analysis, which draws from software studies and studies of visual culture, examines a tool that is fairly new to the Epic Games arsenal: the in-development MetaHuman Creator, part of Epic’s proprietary Unreal Engine. The MetaHuman Creator is a cloud-streamed application that draws from a library of real scans of people and allows 3D content developers to quickly create unique, photorealistic, fully rigged digital humans. MetaHuman creation is a fluid process, and the speedy transformation of character rigs and other non-binary attributes highlights the potential queerness or openness of data. Yet the ongoing push toward (hyper)realism in commercial media has birthed a visual economy supported by an industrial apparatus that privileges mastery over the tools of production, and where bodies and politics are often cleaved in the design process. Epic’s multiethnic, multiracial, transgender MetaHuman Creator is a design tool and not a narrative engine. Its transitions are simple and seamless, and the traces of non-binary and non-white identities are simply part of a larger color palette. These tools represent a way of seeing and knowing the world, and the representations they produce are part of hermetically sealed and privately held encoding processes that include a company’s original data, its application programming, its proprietary build environment, and its interface.

This analysis poses two interrelated questions. Are the MetaHuman Creator and similar simplified building tools democratizing the field of digital content creation? Are they fostering more diverse representations and narratives, and supporting the free play of identity in playable media?

Eric Freedman is Professor and Dean of the School of Media Arts at Columbia College Chicago. He is the author most recently of The Persistence of Code in Game Engine Culture (2020), as well as Transient Images: Personal Media in Public Frameworks (2011). He serves on the editorial board of the International Journal of Creative Media Research and the advisory board of the Communication and Media Studies Research Network.