By Marisa Kendall, CalMatters
This story was originally published by CalMatters.
As AI expands into every facet of society, a California company is testing whether the technology can help improve the health of people living on the streets.
Akido Labs, a Los Angeles-based health care technology company that runs clinics and street medicine teams in California, plans to start using its AI model on homeless and housing-insecure patients in the Bay Area next month. The program generates questions for outreach workers to ask patients and then suggests diagnoses, medical tests and even medications, which a human doctor then signs off on remotely. The idea is to save doctors time and allow them to see more patients.
The new model, called Scope AI, is addressing a very real problem: There aren’t nearly enough doctors visiting encampments and shelters. At the same time, homeless Californians are in much poorer health and are dying earlier than the general population.
“There are individuals who haven’t seen doctors for years. There are individuals who haven’t seen a dentist ever,” said Steve Good, president and CEO of Five Keys, which is partnering with Akido to launch the AI technology in its San Francisco homeless shelters. “There just aren’t enough resources to go in there and find out the needs these individuals have.”
Experts who research AI told CalMatters that, if done right, the technology has the power to increase access to care for homeless people and other marginalized communities. But while many health care providers are already using AI for administrative duties, such as transcribing patient visits, using it to help diagnose people is still relatively new. It raises concerns about data privacy, bias and patient outcomes, which are particularly pressing when the technology is used on homeless patients and other vulnerable groups.
“We don’t have perfect solutions to a lot of these challenges yet,” said Angel Hsing-Chi Hwang, an assistant professor at USC who researches human-AI interaction.
How Scope uses AI to diagnose homeless patients
Scope AI essentially allows non-medically trained outreach workers to start the intake and diagnosis process before a patient sees a doctor.
An outreach worker goes out into the field with Scope on their tablet or laptop. As they start interviewing a patient, Scope suggests questions the outreach worker should ask. Scope listens to, records and transcribes the interview, and as the interaction progresses, it suggests new questions based on what the patient says.
When it has enough information, Scope suggests diagnoses, prescriptions and follow-up tests. That information is then sent to a human doctor, who reviews it (usually the same day) and either signs off on the prescriptions, makes changes, or, if it’s a more complex case, arranges to see the patient to get additional information. The medical care is paid for by Medi-Cal through its CalAIM expansion into social services.
In a demonstration for CalMatters involving an imaginary 56-year-old female patient who complained of trouble breathing, Scope asked several follow-up questions to drill down on her symptoms. It then suggested a diagnosis of COPD or chronic bronchitis, a chest X-ray and a spirometry breathing test, and a prescription for an albuterol inhaler.
The Scope AI technology is already in use in a few target areas. Akido's street medicine teams began using it in homeless encampments in Los Angeles County in 2023 and have since seen more than 5,000 patients with it. Akido also uses AI in encampments in Kern County, in clinics in California and Rhode Island, and to treat ride-share workers in New York.
Scope includes the correct diagnosis among its top three suggestions 99% of the time, according to Akido.
Other studies have called into question the reliability of diagnoses made by artificial intelligence. A 2024 study, for example, found that AI was significantly more likely to misdiagnose breast cancer in Black women than in white women.
The spread of AI into homeless services has sparked concern from some critics, who argue that homeless patients, because of their increased vulnerability, need a human health care provider.
“We should not experiment on patients who are unhoused or have low incomes for an AI rollout,” Leah Goodridge, a tenants’ rights attorney and housing policy expert, and Dr. Oni Blackstock, a physician and executive director of Health Justice, wrote in a recent opinion piece for the Guardian.
Brett Feldman, director of USC Street Medicine, agrees. When someone is homeless, much of their health depends on their living environment, he told CalMatters. For example, he recently treated a patient with scabies. Typically, he would prescribe a shampoo or body wash, but this patient had no access to a shower — a key detail that AI might not know to ask about.
Instead, he prescribed an oral medication. The patient needed one dose right away, and another dose in a week. He had to decide whether to give the patient the second dose now and trust that it wouldn’t get lost or stolen, ask the patient to travel to a pharmacy to pick up the second dose, or try to find the patient again in a week to deliver the dose. AI couldn’t make that complex calculation, and neither could a doctor who hadn’t met the patient and seen their living situation, Feldman said.
And any missteps the AI makes could have outsized consequences when a patient is homeless, Feldman said. If the patient has an issue with the medication prescribed, they likely don’t have an easy way to contact the doctor or have a follow-up appointment.
“I would say, in general, that this would not work for this population,” Feldman said.
Akido argues the benefit of AI is clear: better efficiency and improved access to health care.
Before introducing AI, each of Akido’s street medicine doctors in LA and Kern counties could carry a caseload of about 200 homeless patients at a time, said Karthik Murali, head of safety net programs for the company. Now, it’s closer to 350 patients per doctor, he said, because doctors spend less time asking routine questions and filling out paperwork.
That means more patients get access to care and medication more quickly, Murali said.
Nearly a quarter of homeless Californians surveyed by the UCSF Benioff Homelessness and Housing Initiative reported needing medical care that they couldn’t get in the six months prior to the study. Only 39% said they had a primary care provider. Nearly half of homeless Californians surveyed reported their health as poor or fair — a rate about four times higher than the general U.S. population.
Good, of Five Keys, hopes the technology also will let clinicians build trust and deeper relationships with their clients. An outreach worker using Scope will have time to form a bond with the patient and better respond to their individual needs, as opposed to a doctor who is rushing through the visit to get to the next patient, he said.
His organization hopes to roll out the technology in some of its San Francisco homeless shelters next month.
Partnerships and access
Akido also plans to work with Reimagine Freedom and the Young Women’s Freedom Center to use the AI technology at four centers — in San Francisco, Oakland, Richmond and San Jose — that serve women and girls who are or have been incarcerated. The clients they serve often had poor access to health care while in jail or prison, or had their medical concerns ignored, said Reimagine Freedom President Jessica Nowlan. Many have no trust in the medical system.
Currently, the centers offer health education. This new AI technology will allow them to provide actual medical care, Nowlan said.
“Our guess is we will see a huge increase in women being able to access health and care for themselves,” she said.
Reimagine Freedom started testing Scope AI at its Los Angeles clinic in November. So far, “it’s going really well,” Nowlan said.
Akido plans to partner with additional homeless service providers who can help it roll out its AI technology in more places throughout the Bay Area.
If providers who serve vulnerable patients are left out of the AI race, any benefits in the technology will go to wealthy communities instead — further widening the gap between the haves and have-nots, said Stella Tran, who researches AI companies for a California Health Care Foundation investment fund. That’s why social service providers need to be involved in testing this technology and developing the ground rules and safety checks, she said.
But that doesn’t mean Tran is without concerns. For one, AI performs differently across different communities. An algorithm that produced accurate diagnoses for patients in Los Angeles might not work as well in the Bay Area, she said. And while AI has the potential to be less racially biased than human doctors, that all depends on how the algorithm is constructed.
“I think there is a potential to increase access if we do it right,” Tran said, “with the right set of guardrails and being thoughtful about safety, transparency to patients, consent, all of that.”
This article was originally published on CalMatters and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.

