Why this article is important: As the AI evolution seeps into every service offered by providers in real estate transactions, one must remember that responsibility for AI-generated content ultimately comes to rest with the provider — brokers and agents. The DRE has some tips for staying compliant when AI enters the equation.
AI in real estate
Artificial intelligence, or AI, is everywhere — just this week, news broke of how homeowners are using ChatGPT (one of the more popular AI-powered chatbots) to help market the sale of their homes.
Rather than get left behind by the AI evolution, savvy agents are embracing AI to enhance their practice. Because AI is affecting the behavior of some brokers and agents, the California Department of Real Estate (DRE) is weighing in on how real estate licensees need to analyze the risk presented by its use.
In a recent licensee advisory, the DRE cautions agents who incorporate AI into their practice to remain compliant with real estate agency law. (Surprise, AI is a bit under-trained on the principles licensees must adhere to).
Specifically, the DRE notes real estate licensees are using AI in:
- advertising, including crafting for-sale property descriptions, generating social media posts to solicit clientele and even enhancing photos to attract buyers to properties for sale;
- lead generation by managing website inquiries, often through the use of chatbots;
- property valuation, specifically through automated valuation models a la Zillow’s controversial Zestimate®;
- transaction and document review, including summarizing important documents, tracking deadlines and reviewing documents for errors;
- property management, by screening tenants, evaluating rent prices and making maintenance predictions; and
- mortgage servicing, including using AI to analyze borrower behavior by calling on historical data to ultimately reduce defaults.
While AI can make a real estate agent more efficient by eliminating some of their more mundane tasks, the DRE cautions against complacency and unchecked reliance. AI does not know when the embellishments it introduces make its output incompatible with the rules of conduct for agents in real estate transactions.
Related article:
AI will forever need a supervisor
For example, when an AI-powered tool generates inaccurate, misleading or harmful information, who is on the hook? Can agents shift liability to AI by giving notice that AI occasionally gets it wrong?
The supervising broker is always responsible for the activities of their agents and broker-associates when they act within the scope of their employment with the broker. In turn, when an agent’s reliance on AI leads them astray to the financial detriment of others, the resulting liability exposure falls on the broker.
It’s important to remember that AI is only correct some of the time — it is never foolproof. Therefore, AI-generated materials need to be proofread and fact-checked for compliance with California rules and property information — always.
Further, brokers need to caution their agents against using AI for advice on decisions about services and activities that require a license.
In their advisory, the DRE compares asking AI-powered tools to complete tasks requiring a license to having an unlicensed assistant perform duties requiring a license — both are unlawful.
For example, consider a property owner’s leasing agent who is a licensed real estate broker. The broker installs an AI chatbot on their website to answer tenant-related inquiries. The instructions they give the chatbot are fairly vague, limited to facts about the rental properties they manage and a way for the chatbot to book appointments for the agent to show property to interested tenants.
However, after the chatbot has been installed and in use on the website, the broker discovers it has also been discussing the pricing, terms and conditions of the rental properties the broker manages — activity beyond the authority given to the chatbot, known in AI-speak as “bleed.”
Here, the broker is liable for allowing their “unlicensed assistant” — the chatbot — to conduct activity requiring a license: discussing the pricing, terms and conditions of the rental property. Even though the broker did not instruct the chatbot to do so, the broker is on the hook for not providing enough supervision (e.g., checking to ensure the chatbot was sharing only accurate, lawful information suitable for an unlicensed assistant to convey).
Compliance check
Consider a broker refreshing their advertising materials. They do not consider themselves creative and therefore turn to ChatGPT to help with their next advertising campaign.
The chatbot produces elegant copy the broker uses, and the broker plugs this into a graphic ad generator, also powered by AI.
The entire process takes under an hour and the broker, happy with the results, posts the ad to their social media pages.
In their haste, the broker missed the most important step in advertising: remaining compliant with real estate law.
Advertisements that may constitute a first contact with a consumer, whether they appear on a website, in print or on social media, are required to provide the agent’s full name, DRE license number, Nationwide Mortgage Licensing System (NMLS) ID number (when applicable) and their responsible broker’s identity. [Calif. Business & Professions Code §10140.6]
Consider a seller agent marketing a home for sale. The property is vacant, but the agent knows that presenting images of the property furnished will help potential buyers imagine how they will use the space.
The seller is unwilling to pay for staging, so the agent turns to AI for help. They use an AI-powered image generator to update some of the photos with furnishings. They also realize the property shows nicer in other lights, so they adjust the exposure.
As their final step before publishing the photos on the multiple listing service (MLS), the agent must disclose they digitally manipulated the images.
Since the agent digitally staged the property, they need to disclose:
- that an altered image is not the original; and
- instructions for viewing the original. [BPC §10140.8]
However, these disclosure rules do not apply to images subject to common photo editing that do not change the physical property itself, such as correcting the:
- lighting;
- white balance;
- angle; or
- exposure. [BPC §10140.8(a)(2)]
Any altered image may not change material aspects of the physical property without also presenting the true and unaltered photo. [BPC §10140.8(a)(1)]
These changes include, but are not limited to, adding or removing:
- fixtures;
- furniture;
- appliances;
- flooring;
- walls;
- paint color; or
- landscape. [BPC §10140.8(b)(1)]
Finally, a licensee using AI to generate copy needs to double check the AI-generated content to ensure compliance with state and federal anti-discrimination laws.
For example, consider an agent who directs an AI tool to generate a property description the agent will post when advertising the property as available for sale, such as on an MLS. The agent types in some key property facts and lets AI do the rest.
However, the agent needs to check the description for both accuracy and compliance.
In this case, the agent might have typed in “updated kitchen,” but the AI tool elaborated by describing granite countertops and stainless-steel appliances… when in fact, the countertops are Carrara marble and the appliances are white.
Further, while the agent told the AI tool it was a “nice neighborhood,” the AI tool took this a step further and called it a “quiet neighborhood, very safe.” Here, the word “quiet” may be taken to exclude families with children (a protected class). The word “safe” is also a red flag, since it exposes the agent and their broker to liability over what a homebuyer may or may not consider “safe.”
Property valuation
Remember how mad agents used to get about the prices Zillow posts on property available for sale? Zillow uses an algorithm to provide Zestimates® of property, usually to the frustration of agents with more factual knowledge of a property’s characteristics, neighborhood and market environment. Agents have claimed these Zestimates® mislead sellers and buyers alike.
Now, consider a seller broker tasked with presenting a sales price to their seller client. When the broker turns to AI to produce a figure, their price is just as unreliable as any online MLS inventory aggregator’s (maybe more so, since it carries the endorsement of an experienced broker).
AI can be a helpful place to start when valuing a property, but the broker’s work never ends there. Even the most sophisticated AI valuation tool will not consider:
- the broker’s insights into local market trends;
- how comparable homes for sale are performing on the market;
- property soon coming on the market, of which the broker has knowledge;
- what improvements the seller might consider making to improve the price; and
- a myriad of other factors unique to the property and surrounding area.
AI is quickly becoming an essential tool for tech-savvy agents. But technology cannot and will not replace the experience and knowledge of a real-life real estate licensee, since AI lacks their training and exposure. At the end of the day, it’s your license on the line for misrepresentations and licensing violations, not AI’s.