• What this course is about: how our digital technologies impact society, positively or negatively, and in particular what this means for self-organization and democracy.

    • Computers are powerful tools enabling people to communicate, cooperate, deliberate, make decisions, and innovate in a multitude of ways.  Some uses:

      • Communication: Alice sends Bob an E-mail or makes a conference call.

      • Association: a set of users form an online discussion group or forum.

      • Cooperation: users work together to write or design something.

      • Curation: users find, filter, vet, and organize relevant content.

      • Deliberation: users employ formal processes to debate issues or policy.

      • Decision: users employ social choice (e.g., voting) to decide collectively.

      • Organization: users form organizations/societies, possibly decentralized.

      • Governance: users employ processes to govern membership and behavior.

      • Incentives: digital payments, cryptocurrencies, finance and investment.

      • Evolution: collectively finding and introducing better ways to self-organize.

    • Promises of computing and networking technology as a “democratizing” force:

      • Inclusiveness:

        • Borderless communication with anyone anywhere on Earth

        • Access to information for “everyone”

        • Ability to collaborate productively with anyone globally

        • Ability to associate with like-minded people anywhere

      • Support for individual freedoms:

        • Speech: privacy and anonymity support expression of true sentiment

        • Information: access to [good?] information from anywhere

        • Association: find, form groups with like-minded people anywhere

        • Press: anyone can publish blogs, social media, etc.

      • Resistance to control and censorship by entrenched authorities:

        • Internet as a tool to “route around” failures, censorship

        • Cryptocurrencies as tools for bottom-up economic empowerment

    • But while these tools sometimes succeed, they often fail to serve people in a multitude of ways, due to flaws in either the abstraction or the implementation.

      • Abstraction flaws and challenge examples:

        • Unscalable communication or human attention costs: e.g., broadcast communication, or “pure” direct democracy.

          • Naive cooperation designs incur O(n²) communication costs and fail to scale (see the sketch below).
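
          • A minimal sketch (Python, illustrative only) of the quadratic blow-up: with n participants, naive all-pairs coordination needs n(n-1)/2 channels.

              # Pairwise coordination: every participant must communicate
              # with every other, so costs grow as n*(n-1)/2, i.e., O(n^2).
              def pairwise_channels(n: int) -> int:
                  return n * (n - 1) // 2

              for n in (10, 100, 1000, 10000):
                  print(n, pairwise_channels(n))
              # 10 -> 45; 100 -> 4,950; 1,000 -> 499,500; 10,000 -> 49,995,000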

        • Unreliable or misbehaving human participants: e.g., spam, trolling, sock puppetry, harassment, fake news, fake reviews.

          • The design of the platform and process greatly affects the behavior of its users, so how do we design properly?

          • Who should “police” a community, and how?  Government, platform operator employees, community self-policing?

        • Unintended effects of algorithms: e.g., polarization, radicalization

          • Social recommendation/newsfeed systems and echo chambers

          • Interaction- and click-maximizing algorithms, and anger-exploiting media (e.g., YouTube); see the sketch below.
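
          • A toy sketch (Python; the categories and per-category click probabilities are invented assumptions) of how a greedy click-maximizing feed narrows into an echo chamber:

              import random

              categories = ["politics-left", "politics-right", "sports", "science"]
              clicks = {c: 1.0 for c in categories}  # smoothed click counts
              shows = {c: 1.0 for c in categories}   # smoothed impression counts
              user_bias = {"politics-left": 0.6, "politics-right": 0.2,
                           "sports": 0.3, "science": 0.3}  # assumed click rates

              random.seed(0)
              for step in range(1000):
                  # greedy: always show the category with the best click-through rate
                  c = max(categories, key=lambda cat: clicks[cat] / shows[cat])
                  shows[c] += 1
                  if random.random() < user_bias[c]:
                      clicks[c] += 1

              print({c: int(shows[c]) for c in categories})
              # Nearly all impressions concentrate in a single category:
              # the algorithm optimized for clicks and narrowed the feed.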

        • Bias, both real and perceived, from self-selecting online groups

          • Tribalism, mistrust of information from other ideologies

          • Information relativism: lack of objective notion of “truth”

        • Overcompression of noisy, unreliable, manipulable information.

          • Large-scale campaigns, “beauty contest” elections.

          • Marketing-driven vs “efficient” markets, lemon markets.

          • Accountability of large [government] bureaucracies to the electorate via the bottleneck of representation.

        • The competence- and expertise-identification problem

          • Populace choosing experts: voters may be poor judges of expertise

          • Experts choosing experts: unaccountable, may gradually become insular and fragmentation-prone.

        • Funding/incentivizing the production, curation of information

          • Free information vs monetization, copyright, DRM

          • Social media undermining traditional business models

          • Who bears the cost of information review and policing?

            • Users?  Private platform operators?  Governments?

        • Use and misuse of AI in democratic societies

          • How much should we let (powerful) algorithms do our thinking for us?  What are the benefits and risks?

      • Implementation flaws and challenges:

        • Network security, availability, resilience; “routing around” failures.

          • The upsides (resilience) and downsides (spam)

        • Identity and personhood problem: weak or strong digital identities; security, privacy, and coercion; Sybil and sock-puppet attacks.

          • The tension between anonymity (for freedom) and accountability (for civility).

          • Individual misbehavior versus Sybil amplification (see the sketch below).
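
          • A minimal sketch (Python, with invented voters) of Sybil amplification: when identities are free to create, one attacker can outvote an honest majority.

              honest_votes = {"alice": "yes", "bob": "yes", "carol": "no"}
              sybil_votes = {f"sybil{i}": "no" for i in range(10)}  # one real attacker

              tally = {}
              for choice in list(honest_votes.values()) + list(sybil_votes.values()):
                  tally[choice] = tally.get(choice, 0) + 1
              print(tally)  # {'yes': 2, 'no': 11}: one person decided the outcome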

        • Software bugs, exploitable flaws.

          • Bitcoin wallets as a universal bug bounty.

          • Smart contract bugs: e.g., the DAO and Fomo3D attacks (see the simulation below).
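
          • A toy Python simulation (not Solidity, and greatly simplified relative to the actual 2016 DAO exploit) of the underlying reentrancy pattern: the contract pays out before updating its state, so a malicious receiver can re-enter withdraw() and collect repeated payouts.

              class VulnerableBank:
                  def __init__(self):
                      self.balances = {}

                  def deposit(self, who, amount):
                      self.balances[who] = self.balances.get(who, 0) + amount

                  def withdraw(self, who, receive_callback):
                      amount = self.balances.get(who, 0)
                      if amount > 0:
                          receive_callback(amount)  # BUG: external call first...
                          self.balances[who] = 0    # ...state update second

              bank = VulnerableBank()
              bank.deposit("attacker", 10)

              stolen = []
              def malicious_receive(amount):
                  stolen.append(amount)
                  if len(stolen) < 5:  # re-enter before the balance is zeroed
                      bank.withdraw("attacker", malicious_receive)

              bank.withdraw("attacker", malicious_receive)
              print(sum(stolen))  # 50: five payouts from a single 10-unit balance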

        • Transparency issues: should people trust the system?  Will they?

        • Fairness challenges:

          • Investment-centric versus democratic stake models: the “rich get richer” problem and power centralization

            • Corporations versus governments, foundations

            • Proof-of-work/stake/etc. versus proof-of-personhood (PoP); see the sketch below.
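
            • A toy illustration (Python; the 5% reward and the fixed per-holder cost are assumptions chosen only to make the effect visible) of how stake-weighted power concentrates while one-person-one-vote power does not:

                stakes = {"whale": 100.0}
                stakes.update({f"user{i}": 10.0 for i in range(1, 10)})

                def whale_share():
                    return 100 * stakes["whale"] / sum(stakes.values())

                print(round(whale_share(), 1))  # ~52.6% of stake-weighted power
                for year in range(10):
                    for who in stakes:
                        # 5% proportional reward minus a fixed cost (fees,
                        # hardware) that exactly cancels small holders' gains
                        stakes[who] = stakes[who] * 1.05 - 0.5
                print(round(whale_share(), 1))  # ~63.5%: power keeps concentrating
                # Under proof-of-personhood (one person, one vote), the whale's
                # share would remain 10% regardless of wealth.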

        • Inclusion challenges:

          • Digital divide: access to/expense of devices, connectivity.

          • Deliberate exclusion: disenfranchisement, guardianship.

          • Effective exclusion via trolling, harassment.

          • Identity attributes, personal data, coercion, black market.

    • Goals: What would we like to achieve?  What might our long-term end goals be?

      • Technology enabling and incentivizing civil bottom-up self-organization.

      • Ensure freedoms of expression and association but with accountability.

      • Enable attention-limited users to obtain the (good) information they need to participate fully as time and interests permit, avoiding polarization.

      • Give communities not just voting but effective and scalable deliberation, agenda-setting, and long-term governance evolution processes.

    • Particular social/technical tools and techniques this course will explore:

      • Basic fault-analysis techniques that apply to failures and compromises.

      • Network/graph algorithms for communication and for social/trust networks.

      • Content filtering, recommendation, curation, and peer review processes.

      • Social choice: election methods, sampling, direct and liquid democracy (see the delegation sketch after this list).

      • Deliberation processes: peer review, online deliberation, agenda-setting.

      • Techniques to achieve privacy and anonymity with accountability: e.g., verifiable shuffles, anonymous credentials, zero-knowledge proofs, PoP.
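
      • As a taste of the liquid democracy item above, a minimal sketch (Python, with invented voters) of delegation resolution: each voter either casts a ballot directly or delegates to someone else, and delegated votes follow the chain to whoever actually voted.

          direct_votes = {"alice": "yes", "dana": "no"}
          delegations = {"bob": "alice", "carol": "bob", "erin": "dana"}

          def resolve(voter, seen=None):
              seen = seen or set()
              if voter in direct_votes:
                  return direct_votes[voter]
              if voter in seen or voter not in delegations:
                  return None  # delegation cycle or abstention
              seen.add(voter)
              return resolve(delegations[voter], seen)

          tally = {}
          for v in ("alice", "bob", "carol", "dana", "erin"):
              choice = resolve(v)
              if choice is not None:
                  tally[choice] = tally.get(choice, 0) + 1
          print(tally)  # {'yes': 3, 'no': 2}: alice's ballot carries bob and carol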

    • Deployed technologies we will look at, exploring their promise and flaws:

      • USENET (historical) and other public discussion forums.

      • News media: both traditional and social media platforms.

      • Deliberation: applications like Loomio and LiquidFeedback.

      • Peer review: Slashdot, Reddit, GitHub, HotCRP.

      • Cryptocurrencies and smart contracts: Bitcoin, Ethereum.

      • Organizations and decentralized autonomous organizations (DAOs).

  • Course logistics

    • Introduction of the TAs

    • Content: Readings, lectures, quizzes, occasional hands-on exercises

      • Some may involve light programming in various languages, but the course is not programming-intensive.

    • Sessions: approximately 5 hours per week of total expected time commitment

      • Lecture: attendance normally expected and required

        • Will make provisions to ensure that missing one or two won’t seriously hurt your grade, provided you catch up via colleagues

        • Discussion-oriented, no slides

        • Will stream (Zoom) and record lectures

          • Not a regular substitute for in-person attendance!

          • May not capture everything that happens in lectures

          • Instructor’s notes will be posted after lectures

          • Have a conflicting lecture?  Make sure you can normally attend this one (with few exceptions, e.g., midterm exams)

      • Exercise sessions: attendance is needed only as directed in assigned exercises

        • TAs will be present to help with assigned exercises

      • Practical work: purely optional time/space for reading, discussion

        • Instructors will not be present

    • Grading

      • 10% - quizzes and lecture participation during semester

        • Last year: had 5 quizzes, lowest score dropped

      • 40% - exercises during semester

        • Last year: 4 assignments, the last in 3 phases

          • Exercise 1, USENET: 5%

          • Exercise 2, HotCRP: 10%

          • Exercise 3, DAOs: 5%

          • Exercise 4, Loomio: 20%

      • 50% - written final exam during exam session

    • Workload expectations:

      • Lecture participation expected (and graded)

      • Significant time required for readings, exercises

    • Grading approach: necessarily fuzzy and subjective, but:

      • Grading is based mostly on whether it’s clear that you “did the work”, not on whether you arrived at a prescribed answer; so just do the work, and you’ll do fine.

      • The same instructor grades all students for a given part of a given exercise, ensuring that the subjective grading is consistent and fair across the class.

      • Not graded on a curve: no one needs to fail the course, and everyone can and *will* get at least a solid passing grade if they do the required work consistently.

    • All students are expected to do their own work!  Plagiarism is not tolerated.

    • Office hours

      • Mondays 16:00-16:30

  • For next lecture:

    • Readings

    • Mini-assignment due Monday: one reading suggestion (news item, magazine or scholarly article, etc.) that you saw or found recently and think is particularly insightful and potentially relevant to this course in any way.  Submit the URL and a brief one-sentence summary of why you found it interesting and relevant.

