Яανeη's notes on Automating Inequality (3)


Automating Inequality
  • Title: Automating Inequality
  • Author: Virginia Eubanks
  • Subtitle: How High-Tech Tools Profile, Police, and Punish the Poor
  • Pages: 272
  • Publisher: St. Martin's Press
  • Publication date: 2018-1-23
  • -

    Technology can be a neutral tool. But, as Virginia Eubanks describes in “Automating Inequality,” the computerized tools applied to social service provision are designed with the institutional biases endemic in our society, starting with the idea that poverty is the fault of poor people and that a goal of our welfare systems is to make sure that nobody gets aid who doesn’t deserve it, even if that means denying aid to people who do.

    cathartic

    Eubanks calls the use of technology to evaluate and track poor people the “digital poorhouse,” consistent with efforts throughout U.S. history to distinguish between the “deserving” and “undeserving” poor.

    • Robert Moses-Southern State Parkway

    *efficiency and quantity, fairness*

    Digital poorhouse -

    • Quincy: the impotent poor (people with disabilities who are “entitled” to aid, the deserving poor) and the able poor (people seen as merely lazy, who are shamed)
    • Crisis homeless vs. chronic homeless
      • The system only aims to help the crisis homeless and provide short-term support
      • The coordinated entry system only takes information from homeless people and puts it into a dataset without providing any actual help (what is the purpose of collecting this data?); service providers then share that information with law enforcement, again stigmatizing the poor
    • A similar example: Beijing evicted tens of thousands of migrant workers and demolished homes deemed dangerous or illegal. The city government said they were being pushed out for their own safety, but many migrants said the government was using a fire as an excuse to ramp up efforts to drive them out.

    Poor people are lazy: this idea persists.

    Removing human agency means people blame the algorithm, not the government or its incompetence (blame the algorithm: that is exactly what the government wants us to think).

    2020-02-12 01:15:05
  • ^

    Forty years ago, nearly all of the major decisions that shape our lives—whether or not we are offered employment, a mortgage, insurance, credit, or a government service—were made by human beings. They often used actuarial processes that made them think more like computers than people, but human discretion still ruled the day. Today, we have ceded much of that decision-making power to sophisticated machines. Automated eligibility systems, ranking algorithms, and predictive risk models control which neighborhoods get policed, which families attain needed resources, who is short-listed for employment, and who is investigated for fraud.

    But that’s the thing about being targeted by an algorithm: you get a sense of a pattern in the digital noise, an electronic eye turned toward you, but you can’t put your finger on exactly what’s amiss. There is no requirement that you be notified when you are red-flagged.

    We all inhabit this new regime of digital data, but we don’t all experience it in the same way. What made my family’s experience endurable was the access to information, discretionary time, and self-determination that professional middle-class people often take for granted.

    Poor people don't have these resources

    Not everyone fares so well when targeted by digital decision-making systems. Some families don’t have the material resources and community support we enjoyed. Many don’t know that they are being targeted, or don’t have the energy or expertise to push back when they are.

    Big Brother is not watching you, he’s watching us. Most people are targeted for digital scrutiny as members of social groups, not as individuals. People of color, migrants, unpopular religious groups, sexual minorities, the poor, and other oppressed and exploited populations bear a much higher burden of monitoring and tracking than advantaged groups.

    Isn't this just ranking people into higher and lower classes? Without algorithms, would people stop sorting each other into tiers? They would still do it.

    Marginalized groups face higher levels of data collection when they access public benefits, walk through highly policed neighborhoods, enter the health-care system, or cross national borders. That data acts to reinforce their marginality when it is used to target them for suspicion and extra scrutiny. Those groups seen as undeserving are singled out for punitive public policy and more intense surveillance, and the cycle begins again. It is a kind of collective red-flagging, a feedback loop of injustice.

    e.g. Indigenous people (who have less access to healthcare) and Black people (who are more likely to be searched on the street).

    The proposed laws were impossible to obey, patently unconstitutional, and unenforceable, but that’s not the point. This is performative politics. The legislation was not intended to work; it was intended to heap stigma on social programs and reinforce the cultural narrative that those who access public assistance are criminal, lazy, spendthrift addicts.

    Stigmatize the poor

    high-tech economic development was increasing economic inequality in my hometown, intensive electronic surveillance was being integrated into public housing and benefit programs, and policy-makers were actively ignoring the needs and insights of poor and working people.

    Massive investments in data-driven administration of public programs are rationalized by a call for efficiency, doing more with less, and getting help to those who really need it.

    “They’re great. Except [Social Services] uses them as a tracking device.” I must have looked shocked, because she explained that her caseworker routinely looked at her purchase records. Poor women are the test subjects for surveillance technology,

    Dorothy’s insight was prescient. The kind of invasive electronic scrutiny she described has become commonplace across the class spectrum today. Digital tracking and decision-making systems have become routine in policing, political forecasting, marketing,

    Across the country, poor and working-class people are targeted by new tools of digital poverty management and face life-threatening consequences as a result. Automated eligibility systems discourage them from claiming public resources that they need to survive and thrive. Complex integrated databases collect their most personal information, with few safeguards for privacy or data security, while offering almost nothing in return. Predictive models and algorithms tag them as risky investments and problematic parents. Vast complexes of social service, law enforcement, and neighborhood surveillance make their every move visible and offer up their behavior for government, commercial, and public scrutiny.

    Automated decision-making shatters the social safety net, criminalizes the poor, intensifies discrimination, and compromises our deepest national values. It reframes shared social decisions about who we are and who we want to be as systems engineering problems. And while the most sweeping digital decision-making tools are tested in what could be called “low rights environments” where there are few expectations of political accountability and transparency, systems first designed for the poor will eventually be used on everyone.

    The digital poorhouse deters the poor from accessing public resources; polices their labor, spending, sexuality, and parenting; tries to predict their future behavior; and punishes and criminalizes those who do not comply with its dictates. In the process, it creates ever-finer moral distinctions between the “deserving” and “undeserving” poor, categorizations that rationalize our national failure to care for one another.

    I don't get it. How is this example related?

    Quincy genuinely wanted to alleviate suffering, but he believed that poverty was a result of bad personal habits, not economic shocks.

    "poor people are poor because they are lazy" ?

    The impotent poor, he wrote in 1821, were “wholly incapable of work, through old age, infancy, sickness or corporeal debility,” while the able poor were just shirking.

    From the beginning, the poorhouse served irreconcilable purposes that led to terrible suffering and spiraling costs. On the one hand, the poorhouse was a semi-voluntary institution providing care for the elderly, the frail, the sick, the disabled, orphans, and the mentally ill. On the other, its harsh conditions were meant to discourage the working poor from seeking aid.

    Many of the institution’s daily operations could thus be turned into side businesses: the keeper could force poorhouse residents to grow extra food for sale, take in extra laundry and mending for profit, or hire inmates out as domestics or farm-workers.

    This still exists in the prison system today: prisoners as free labour without benefits.

    Scientific charity argued for more rigorous, data-driven methods to separate the deserving poor from the undeserving. In-depth investigation was a mechanism of moral classification and social control. Each poor family became a “case” to be solved; in its early years, the Charity Organization Society even used city police officers to investigate applications for relief. Casework was born.

    e.g. Beijing clearing out the “low-end population”

    eugenics practitioners quickly turned their attention to eliminating what they saw as negative characteristics of the poor: low intelligence, criminality, and unrestricted sexuality.

    Eugenics???Hitler!!!

    Eugenics created the first database of the poor.

    So, should we blame the algorithm, or should we blame the people who are discriminatory and biased toward these groups?

    Eugenics was intended to cleanse the race from within by shining a clinical spotlight on what Dr. Albert Priddy called the “shiftless, ignorant, and worthless class of anti-social whites of the South.”

    The movement blended elite anxieties about white poverty with fears of increased immigration and racist beliefs that African Americans were innately inferior.

    If the poorhouse was a machine that diverted the poor and working class from public resources, scientific charity was a technique of producing plausible deniability in elites.

    The design of New Deal relief policies reestablished the divide between the able and the impotent poor.

    The able poor were still white male wage workers thrown into temporary unemployment.

    The impotent poor were still those who faced long-term challenges to steady employment: racial discrimination, single parenthood, disability, or chronic illness. But they were suddenly characterized as undeserving, and only reluctantly offered stingy, punitive, temporary relief.

    Public assistance programs were less generous because benefit levels were set by states and municipalities, not the federal government. They were more punitive because local and state welfare authorities wrote eligibility rules and had financial incentive to keep enrollments low. They were more intrusive because income limits and means-testing rationalized all manner of surveillance and policing of applicants and beneficiaries.

    In distinguishing between social insurance and public assistance, New Deal Democrats planted the seeds of today’s economic inequality, capitulated to white supremacy, sowed conflict between the poor and the working class, and devalued women’s work.

    Elected officials and state bureaucrats, caught between increasingly stringent legal protections and demands to contain public assistance spending, performed a political sleight of hand. They commissioned expansive new technologies that promised to save money by distributing aid more efficiently.

    In fact, these technological systems acted like walls, standing between poor people and their legal rights. In this moment, the digital poorhouse was born.

    The advocates of automated and algorithmic approaches to public services often describe the new generation of digital tools as “disruptive.” They tell us that big data shakes up hidebound bureaucracies, stimulates innovative solutions, and increases transparency. But when we focus on programs specifically targeted at poor and working-class people, the new regime of data analytics is more evolution than revolution. It is simply an expansion and continuation of moralistic and punitive poverty management strategies that have been with us since the 1820s.

    moving away from face-to-face casework and toward electronic communication would make offices more organized and more efficient.

    The problem with the existing caseworker-centered system, as they saw it, was twofold. First, caseworkers spent more time manually processing papers and collecting data than “using their social work expertise to help clients.” Second, the outdated data system allowed caseworkers to collude with outside co-conspirators to illegally obtain benefits and defraud taxpayers.

    Failure to cooperate notices offered little guidance. They simply stated that something was not right with an application, not what specifically was wrong. Was a document missing, lost, unsigned, or illegible? Was it the fault of the client, the FSSA, or the contractor?

    The goals of the project were consistent throughout the automation experiment: maximize efficiency and eliminate fraud by shifting to a task-based system and severing caseworker-to-client bonds. They were clearly reflected in contract metrics: response time in the call centers was a key performance indicator; determination accuracy was not. Efficiency and savings were built into the contract; transparency and due process were not.

    This idea of efficiency keeps coming back, over and over. Is it the real issue? Is the issue quantity? Is it really efficient? More importantly, efficient for whom?

    The state now uses the hybrid eligibility system, which combines face-to-face interactions with public employees with the electronic data processing and privatized administration of the automated system. Its design allows applicants to contact a team of regional caseworkers assigned to their case by phone, by internet, by mail, or in person, providing increased contact with state workers. But the hybrid system still relies on privatized, automated processes for many core functions and retains the task-based case management that caused so many problems during the modernization.

    Automated decision-making can change government for the better, and tracking program data may, in fact, help identify patterns of biased decision-making. But justice sometimes requires an ability to bend the rules. By removing human discretion from frontline social servants and moving it instead to engineers and private contractors, the Indiana experiment supercharged discrimination.

    The “social specs” for the automation were based on time-worn, race- and class-motivated assumptions about welfare recipients that were encoded into performance metrics and programmed into business processes: they are lazy and must be “prodded” into contributing to their own support, they are sneaky and prone to fraudulent claims, and their burdensome use of public resources must be repeatedly discouraged. Each of these assumptions relies on, and is bolstered by, race- and class-based stereotypes.

    2020-04-19 00:36:05
  • ~

    New high-tech tools allow for more precise measuring and tracking, better sharing of information, and increased visibility of targeted populations. In a system dedicated to supporting poor and working-class people’s self-determination, such diligence would guarantee that they attain all the benefits they are entitled to by law. In that context, integrated data and modernized administration would not necessarily result in bad outcomes for poor communities. But automated decision-making in our current welfare system acts a lot like older, atavistic forms of punishment and containment. It filters and diverts. It is a gatekeeper, not a facilitator.

    the Indiana automation experiment was a form of digital diversion for poor and working Americans. It denied them benefits, due process, dignity, and life itself.

    Those facing crisis homelessness tend to be experiencing “short-term emergencies [such as] eviction, domestic violence, sudden illness, or job loss, or reentering communities after incarceration.” The crisis homeless, Culhane argues, often self-correct: after a short stay in a shelter, they identify family members they can stay with, access new resources, or move away. A small, time-limited investment can offer them “a hand up to avoid the downward spiral” into chronic homelessness.

    There was a mismatch between needs and resources: the crisis homeless got resources most appropriate for the chronically homeless; the chronically homeless got nothing at all.

    The survey also collects protected personal information: social security number, full name, birth date, demographic information, veteran status, immigration and residency status, and where the respondent can be found at different times of day. It collects domestic violence history. It collects a self-reported medical history that includes mental health and substance abuse issues. The surveyor will ask if it is OK to take a photograph.

    According to the system’s designers and funders, coordinated entry upends the status quo in homeless services that privileged stronger clients. It builds new, deeper bonds between service providers throughout Los Angeles, leading to increased communication and resource sharing. It provides sophisticated, timely data about the nature of the housing crisis that can be used to shape more responsive policy-making. But most crucially, by matching homeless people to appropriate housing, it has the potential to save the lives of thousands of people.

    “There is this pressure to stretch every dollar as far as you can, to make sure that you’re being as absolutely efficient and effective as possible.”

    Rapid re-housing is aimed at the crisis homeless. Coordinated entry in Los Angeles, which initially focused on getting the most vulnerable unhoused people into permanent supportive housing, now aims to match the newly homeless with short-term support. That leaves those in the middle—too healthy to qualify for a rare unit of permanent supportive housing but out on the street far too long to make a major change with the limited resources of rapid re-housing—out in the cold.

    “Coordinated entry system? The system that’s supposed to be helping the homeless? It’s halting the homeless. You put all the homeless people in the system, but they have nowhere for them to go. Entry into the system but with no action.”

    There is no requirement that the information released be limited in scope or specific to an ongoing case. There is no warrant process, no departmental oversight, no judge involved to make sure the request is constitutional.

    In many neighborhoods, community policing is preferable to reactive, incident-driven law enforcement. But it also raises troubling questions. Community policing casts officers as social service or treatment professionals, roles for which they rarely have appropriate training. It pulls social service agencies into relationships with police that compromise their ability to serve the most marginalized people, who often have good reason to avoid law enforcement. Police presence at a social service organization is sufficient to turn away the most vulnerable unhoused, who might have outstanding warrants for status crime tickets associated with being homeless.

    Further integrating programs aimed at providing economic security and those focused on crime control threatens to turn routine survival strategies of those living in extreme poverty into crimes. The constant data collection from a vast array of high-tech tools wielded by homeless services, business improvement districts, and law enforcement create what Skid Row residents perceive as a net of constraint that influences their every decision.

    To understand coordinated entry as a system of surveillance, it is crucial to differentiate between “old” and “new” surveillance.

    in new data-based surveillance, the target often emerges from the data. The targeting comes after the data collection, not before. Massive amounts of information are collected on a wide variety of individuals and groups. Then, the data is mined, analyzed, and searched in order to identify possible targets for more thorough scrutiny.

    Surveillance is not only a means of watching or tracking, it is also a mechanism for social sorting. Coordinated entry collects data tied to individual behavior, assesses vulnerability, and assigns different interventions based on that valuation.

    If homelessness is inevitable—like a disease or a natural disaster—then it is perfectly reasonable to use triage-oriented solutions that prioritize unhoused people for a chance at limited housing resources. But if homelessness is a human tragedy created by policy decisions and professional middle-class apathy, coordinated entry allows us to distance ourselves from the human impacts of our choice to not act decisively. As a system of moral valuation, coordinated entry is a machine for producing rationalization, for helping us convince ourselves that only the most deserving people are getting help. Those judged “too risky” are coded for criminalization. Those who fall through the cracks face prisons, institutions, or death.

    2020-04-19 00:38:16
