Facial Monitoring: The All-Telling Eye

We know what you’re thinking

IMAGINE browsing a website when a saucy ad for lingerie catches your eye. You don’t click on it, merely smile and go to another page. Yet it follows you, putting up more racy pictures, perhaps even the offer of a discount. Finally, irked by its persistence, you frown. “Sorry for taking up your time,” says the ad, and promptly desists from further pestering. Creepy. But making online ads that not only know you are looking at them but also respond to your emotions will soon be possible, thanks to the power of image-processing software and the ubiquity of tiny cameras in computers and mobile devices.

Uses for this technology would not, of course, be confined to advertising. There is ample scope to deploy it in areas like security, computer gaming, education and health care. But admen are among the first to embrace the idea in earnest. That is because it helps answer, at least online, clients’ perennial carp: that they know half the money they spend on advertising is wasted, but they don’t know which half.

Advertising firms already film how people react to ads, usually in an artificial setting. The participants’ faces are studied for positive or negative feelings. A lot of research, some of it controversial, has been done into ways of categorising the emotions behind facial expressions. In the 1970s Paul Ekman, an American psychologist, developed a comprehensive coding system which is still widely used.

Some consumer-research companies also employ goggle-mounted cameras to track eye movements so they can be sure what their subjects are looking at. This can help determine which ads attract the most attention and where they might be placed for the best effect on a web page.

This work is now moving online. Higher-quality cameras and smarter computer-vision software mean that volunteers can work from home and no longer need to wear clunky headgear. Instead, their eyes can be tracked using a single webcam.

One of the companies doing such work, Realeyes, which is based in London, has been developing a system that combines eye-spying webcams with emotional analysis. Mihkel Jäätma, who founded the company in 2007, says that his system is able to gauge a person’s mood by plotting the position of facial features, such as eyebrows, mouth and nostrils, and employing clever algorithms to interpret changes in their alignment—as when eyebrows are raised in surprise, say. Add eye-movement tracking, hinting at which display ads were overlooked and which were studied for any period of time, and the approach offers precisely the sort of quantitative data brand managers yearn for.
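The article only sketches how such systems work: plot the positions of facial features, compare them with a neutral baseline, and interpret changes in their alignment. The toy Python sketch below illustrates that idea under those assumptions; the landmark names, ratios and thresholds are invented for illustration and are not Realeyes's actual model.

```python
# Toy illustration of expression reading from facial landmarks.
# Landmarks are assumed to be (x, y) pixel coordinates already produced by a
# face tracker; the feature ratios and thresholds below are hypothetical.

from typing import Dict, Tuple

Point = Tuple[float, float]

def classify_mood(landmarks: Dict[str, Point], neutral: Dict[str, Point]) -> str:
    """Compare current landmark positions with a neutral-face baseline."""
    def dy(name: str) -> float:
        # Positive when the feature has moved up relative to the neutral frame
        # (image y grows downwards, hence the sign flip).
        return neutral[name][1] - landmarks[name][1]

    # Face height as a scale factor, so the ratios are size-independent.
    face_h = abs(neutral["chin"][1] - neutral["brow_left"][1])

    brow_raise = (dy("brow_left") + dy("brow_right")) / (2 * face_h)
    mouth_corner_lift = (dy("mouth_left") + dy("mouth_right")) / (2 * face_h)

    if brow_raise > 0.05:
        return "surprised"      # eyebrows raised
    if mouth_corner_lift > 0.03:
        return "positive"       # smile: mouth corners pulled up
    if mouth_corner_lift < -0.03:
        return "negative"       # frown: mouth corners pulled down
    return "neutral"

# Example with hypothetical pixel coordinates:
neutral = {"brow_left": (100, 120), "brow_right": (160, 120),
           "mouth_left": (110, 200), "mouth_right": (150, 200),
           "chin": (130, 240)}
current = dict(neutral, brow_left=(100, 110), brow_right=(160, 110))
print(classify_mood(current, neutral))   # -> "surprised"
```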

At present the system is being used on purpose-built websites with, for instance, online research groups testing the effect of various display ads. The next step is to make interactive ads. Because they can spot the visual attention given to them, as well as the emotional state of the viewer, these ads could tailor their responses.

As similar gimmicks become widespread, privacy concerns will inevitably mount. People would need to give consent to their webcams being used in this way, Mr Jäätma admits. One way to persuade internet users to grant access to their images would be to offer them discounts on goods or subscriptions to websites.

Realeyes is also working with Kaplan, an educational-services company, on a project in Hungary which is using the system to measure how children respond to virtual games that teach them English. The hope is that by performing the same emotion-reading trick that marketers use, the type of tasks and the characters that appear in them can be made more engaging.

The technology would make computer games more engaging, too. Sony, for one, thinks that reading players’ emotions with webcams would let software pick up on their subconscious behaviour and change the game in ways that would enhance the experience. The company claims that in the future it will be possible to have something like a detective game in which the camera can read players’ faces and measure their heart rates in order to have a stab at deciding which ones are lying.

In fact, webcams that monitor a person’s heart rate are soon to appear. Instead of sticking sensors onto the skin, Philips has developed a vital-signs camera system which the Dutch company says can measure heart and respiration rates extremely accurately. To calculate the heart rate the camera detects tiny changes in the colour of the skin. These changes, imperceptible to the human eye, occur as the heart pumps blood through the body. The person’s breathing rate is measured by detecting the rise and fall of his chest. The firm will soon launch an app for Apple’s iPad 2 which will allow people to measure their own heart and breathing rates using the two webcams in that device.
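The article describes the principle behind the vital-signs camera (tiny colour changes in the skin as the heart pumps blood) but not Philips's algorithm. The sketch below shows one generic way such a signal is commonly turned into a heart-rate estimate, assuming a per-frame average of the green channel over a face region has already been extracted; it is an illustration of the idea, not the product's implementation.

```python
# Sketch of camera-based heart-rate estimation (remote photoplethysmography).
# Assumes `green_means` holds the average green-channel value of a face region
# for each video frame, captured at `fps` frames per second.

import numpy as np

def estimate_heart_rate(green_means: np.ndarray, fps: float) -> float:
    """Return the estimated heart rate in beats per minute."""
    signal = green_means - green_means.mean()     # remove the DC component
    signal = signal * np.hanning(len(signal))     # window to reduce spectral leakage

    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)

    # Plausible heart rates: roughly 40-180 bpm, i.e. 0.7-3.0 Hz.
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Example with a synthetic 72-bpm pulse (1.2 Hz) sampled at 30 frames per second:
fps = 30.0
t = np.arange(0, 20, 1.0 / fps)                        # 20 seconds of video
fake_signal = 0.5 * np.sin(2 * np.pi * 1.2 * t) + np.random.normal(0, 0.1, t.size)
print(round(estimate_heart_rate(fake_signal, fps)))    # ~72
```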

Philips is developing the technology as a contactless system to keep a virtual eye on hospital patients, such as newborn babies, who might find conventional monitors distressing. The company is also eyeing anxious parents who always want to know what their tots are up to, as well as anxious coaches and their athletes. Advertising firms will, no doubt, be just as keen to measure heartbeats, especially for ads designed to get pulses racing. Those who find it all smacks of Big Brother can turn their webcams off. If you are playing online poker, that is probably a wise idea.

Webcams can now spot which ads catch your gaze, read your mood and check your vital signs...

Firefox Takes Unusual Approach In Unveiling ‘Do Not Track’ Option

For all the talk among policymakers and the press about online privacy, it still isn’t clear how much average consumers are even aware of online ad tracking. Firefox, the browser of choice for a third of all internet users, is apparently looking to change that. The beta of the latest version of Firefox trumpets the new “Do Not Track” feature prominently—listing it, in large font, as the very first item on the “What’s New in Firefox 4” page. The move could increase the pressure on other browser companies as well as advertisers to beef up their own privacy options.

Mozilla announced months ago that it would put a Do Not Track option in the new version of Firefox—so in that sense, the release of the beta version isn’t a surprise. But what is unexpected is the headline “Opt Out of Ad Tracking” splashed across the company’s upgrade page.

[Screenshot of the Firefox 4 beta upgrade page: “What’s New in Firefox 4 Beta”, with “Opt Out of Ad Tracking” listed first]

Reality Check Ahead: Data Mining and the Implications for Real Estate Professionals

MLS is a 100-year-old institution that expertly aggregates and houses most, if not all, of real estate’s most critical data. Today, our data is being leveraged, sourced, scraped, licensed and syndicated by a grand assortment of players, partners and members. It’s being utilized in ways never imagined just a decade ago. Or, for that matter, six months ago.

The result: a plethora of competitive, strategic, financial and security-based issues has surfaced that challenges every MLS, as well as every single one of our members/customers.

I think about this all the time. During my recent visit with my son KB – a college junior – he told me about how Google recently came to his campus offering everyone free email, voice mail, Docs (to replace MS Office) and data storage – an impressive list of free services for all.

I asked him why this publicly traded company would give away its products for free. Despite his soaring IQ and studies in information systems technology, he couldn’t come up with an answer.

Searching Google on my laptop I presented KB with the following Google customer email (September, 2009) that read: “We wanted to let you know about some important changes … in a few weeks, documents, spreadsheets and presentations that have been explicitly published outside your organization and are linked to or from a public website will be crawled and indexed, which means they can appear in search results you see on Google.com and other search engines.” Note: once data is available on Google searches, their business model calls for selling advertising around that search result.

Bear in mind this refers to published docs and not those labeled as private – a setting within Google Docs of which not all users are aware.

I also presented him with the specific EULA (End-User Licensing Agreement) language that states how a user grants a “perpetual, irrevocable, royalty free license to the content for certain purposes (republication, publication, adaptation, distribution), extending to the provision of syndicated services and to use such content in provision of those services.”

 

I recounted for KB how back in March of 2010, we learned in the national news that: “A confidential, seven-page Google Inc. “vision statement” shows the information-age giant is in a deep round of soul-searching over a basic question: How far should it go in profiting from its crown jewels—the vast trove of data it possesses about people’s activities?”

[Chart not reproduced. Source: Wall Street Journal, August 10, 2010.]

The chart showed that nearly 85% of respondents are concerned about advertisers tracking their online behavior.

Then the Wall Street Journal published an article titled “What They Know,” which discusses how companies are developing ‘digital fingerprint’ technology to track our use of individual computers, mobile devices and TV set-top boxes so they can sell the data to advertisers. It appears that each device broadcasts a unique identification number that computer servers can recognize, store in a database and later analyze for monetization. The accompanying 3-minute video is a must-see!

By the way, they call this practice “Human Barcoding.” KB began to squirm. As we all should.
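The article describes the result (a stable identifier per device) rather than the mechanism. Device “fingerprinting” of this kind is generally described as combining many weakly identifying attributes into one identifier; the sketch below is a generic illustration of that idea, with hypothetical attribute names, not any particular company’s system.

```python
# Illustrative device-fingerprint sketch: hash several device attributes into a
# single stable identifier. The attribute names and values are hypothetical.

import hashlib

def device_fingerprint(attrs: dict) -> str:
    """Hash a sorted, canonical representation of device attributes."""
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

fp = device_fingerprint({
    "user_agent": "Mozilla/5.0 ...",
    "screen": "1280x800x24",
    "timezone": "UTC-05:00",
    "fonts": "Arial;Helvetica;Times",
    "plugins": "Flash 10.1;QuickTime",
})
print(fp)   # the same attribute set always yields the same identifier
```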

 

Data. Security. And real estate

So what do the “innovative” data mining and monetization methods now in use by Google and others mean for real estate – specifically for the data aggregated by an MLS and then shared around the globe?

We all must first grasp what happens to listing data when it’s collected and syndicated into “the cloud”, as well as the human and transactional interactions that follow from start to finish (and beyond, actually).

Second, we need to understand how business intelligence and analytics are being applied to the data generated by real estate transactions today. If the data is being monetized without the knowledge and permission of its rightful owner, then agreements potentially need to be negotiated (or renegotiated) and modified to get in step with today’s (and tomorrow’s) inevitable ways of doing business. I’m not in any way opposed to data mining per se; the issue at hand is fair compensation for the data on which it is based.

Here’s why the latest developments regarding Google (and others) are vitally important:

 

  • The world of leveraging digital information is changing very rapidly. As businesses push harder and deeper in their quest to monetize data, information, bits/bytes and mouse clicks, we must establish a clear and informed consensus on who exactly owns the data, who should control it and how it should be monetized. Protecting OUR “crown jewels”, if you will.
  • What do you know about “Human Barcoding”? It’s time for industry leaders to research this new phenomenon and begin to establish the basis for an industry position as it pertains to residential real estate.
  • How do we, as an industry, determine the real value of data beyond the property-centric context? As true business intelligence and data mining progress in our industry, we need “comps” to build upon to derive a valuation model.
  • What exactly is the MLS’s role? Are we the “stewards” of the data (on behalf of our customers) that emanates from the property record and the subsequent transaction and electronic interactions between all the parties connected to it?  How should the MLS industry confront the challenge?

We all certainly remember when the national consumer portals planted their flag(s) on this industry and, by association, MLS territory. Their rationale then was that they would help drive “eyeballs” and traffic to the inventory. Indeed they have. But, looking back, it all came with a pretty steep price tag.

For example, referral fees were subsequently replaced with advertising revenues that more often than not started chipping away at the edges of the broker’s affiliated business models (mortgage, insurance, etc). Now, as a result, the margins of the business are perilously thin from a broker’s perspective.

The MLS began as a business to facilitate a fair distribution of commissions and compensation amongst brokers. It’s safe to say, dear Toto, that we are not in Kansas anymore. Given a digital landscape where value can be derived in so many unique ways, and given that others whose motives for increasing the value of the asset are potentially suspect are already in the game, it’s critical that we convene right now to assert an intellectual lead on what is happening here, or at least make the conscious decision to step aside.

I’m sure there are many other questions and reasons why this is “mission critical” to us. But what I’ve offered, with the help of several really smart folks in the industry, provides a good starting point. We welcome all industry commentators on this topic. Thanks in advance for sharing ….

John L. Heithaus Chief Marketing Officer, MRIS (john.heithaus@mris.net)

PS – a “tip of the hat” to Greg Robertson of Vendor Alley for starting us on this path after his excellent post “Inside Trulia’s Boiler Room”*. I also benefited mightily from the comments of David Charron of MRIS, Marilyn Wilson of the WAV Group and Marc Davison of 1000watt Consulting, and I extend my appreciation to them for sharing their perspectives.

* After this story ran, the YouTube video interview with a Trulia staffer was made “private” and is now inaccessible. Vendor Alley’s analysis of the video provides an excellent overview of the situation.

 

FTC Considers Do-Not-Track List 07/28/2010

The Federal Trade Commission is considering proposing a do-not-track mechanism that would allow consumers to easily opt out of all behavioral targeting, chairman Jon Leibowitz told lawmakers on Tuesday.

Testifying at a hearing about online privacy, Leibowitz said the FTC is exploring the feasibility of a browser plug-in that would store users' targeting preferences. He added that either the FTC or a private group could run the system.

Leibowitz said that while Web users on a no-tracking list would still receive online ads, those ads wouldn't be targeted based on sites that users had visited in the past.

Three years ago, a coalition of privacy groups including the World Privacy Forum, Center for Digital Democracy and Center for Democracy & Technology proposed that the FTC create a do-not-track registry, similar to the do-not-call registry. At the time, the online ad industry strongly opposed the idea of a government-run no-tracking list.

Currently, many people who want to opt out do so through cookies, either on a company-by-company basis or through the Network Advertising Initiative's opt-out cookie (which allows users to opt out of targeting from many of the largest companies). But those opt-outs aren't stable because they're tied to cookies, which often get deleted.

The Network Advertising Initiative recently rolled out a browser plug-in that enables consumers to opt out of targeted ads by NAI members.

Leibowitz also told lawmakers that he personally favored opt-in consent to behavioral targeting, or receiving ads based on sites visited. "I think opt-in generally protects consumers' privacy better than opt-out, under most circumstances," he said. "I don't think it undermines a company's ability to get the information it needs to advertise back to consumers."

Online ad companies say that behavioral targeting is "anonymous" because they don't collect users' names or other so-called personally identifiable information, but Leibowitz said that it might be possible to piece together users' names from clickstream data. He told lawmakers about AOL's "Data Valdez," in which AOL released three months of "anonymized" search queries for 650,000 users. Even though the company didn't directly tie the queries to users' names, some users were identified based solely on the patterns in their search queries.

Several lawmakers expressed concerns about behavioral advertising during Tuesday's hearing. Sen. Claire McCaskill (D-Mo.) said she was "a little spooked out" about online tracking and ad targeting.

McCaskill said that after reading online about foreign SUVs, she noticed that she was receiving ads for such cars. "That's creepy," she said, likening it to someone following her with a camera and recording her moves.

She added that if an "average American" were to learn that someone was trailing him around stores with a camera, "there would be a hue and cry in this country that would be unprecedented."

Sen. Jay Rockefeller (D-W. Va.) and Sen. John Kerry (D-Mass.) both expressed concern that privacy policies weren't giving Web users enough useful information about online ad practices.

Rockefeller suggested that some companies were burying too much information in lengthy documents that consumers don't read. "Some would say the fine print is there and it's not our fault you didn't read it," he said, adding, "I say, that's a 19th-century mentality."

Kerry added that he doubted consumers understood how companies use data. "I'm not sure that there's knowledge in the caveat emptor component of this," he said.


Social cues, social responses, humans know when a computer is engaging them

Posted on Wednesday, July 28, 2010, at 17:18, by Eric Bryn, under social media, social media and direct marketing research.

This research paper from Nokia Research Center, Stanford, and Queens University implies that humans can ascertain with an uncanny degree of certainty whether a social message was sent by a computer or by a human. Social responses to communication technologies (SRCT) theory predicts that humans cannot reliably ascertain such nuances; this research contradicts that premise.

The research team, building on prior SRCT research, tested whether humans could discern whether a text message was sent by a human or by a computer when flattery was an element of the message. They found that humans could reliably discern the originator of the message, apparently because certain social cues were missing from the computer-generated messages.

Why this is relevant research: SRCT theories could be used by software designers to create computer programs that engage social network users with the goal of getting them to increase self-disclosure under the guise of interacting with a human. With the FTC recently considering allowing people to opt out of behavioral targeting on the Web, the issue of nudging people towards more self-disclosure is timely given all the issues surrounding privacy and the use of PII in social networks, especially if a user discloses such PII under the assumption that they’re interacting with a human. This is a very interesting article and a quick read (four pages).

