Responsible AI speakership

A gentleman in the audience: “Can the AI model detect sarcasm?”
Me: “Good question… not yet.”

I’m learning to be Responsible with my AI Speakership.
An AI speaker sweeps the audience off their feet with the latest and greatest innovations in tech. A Responsible AI Speaker shows them both sides of the coin.

It is important that technologists know what works and what does not.

My go-to reference is Microsoft’s Transparency Notes for AI services.

They manifest the Responsible AI principle of Accountability: designers and developers of AI systems need to create systems with maximum transparency.

An AI system is not just about the technology. It includes the people who will use it, the people who will be affected by it, and the environment in which it is deployed.

When we know what works and what doesn’t, we know our options and can make choices. We control our destiny.

To answer the gentleman’s question, here is an excerpt from the Transparency note for Sentiment Analysis:
“The model may have problems recognizing sarcasm. Context, like tone of voice, facial expression, the author of the text, the audience for the text, or prior conversation can be important to understand the sentiment in some cases. Often with sarcasm, additional context is needed to recognize if a text input is positive or negative. Given that the service only sees the text input, classifying sarcastic sentiment can be less accurate. For example, that was awesome, could be either positive or negative depending on the context, tone of voice, facial expression, author and the audience.”
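The limitation is easy to demonstrate with a toy, lexicon-based scorer. To be clear, this is a hypothetical sketch for illustration, not the Azure service’s actual model; the word lists and function name are made up. It shows why a model that only sees text can misread sarcasm: the words in a sarcastic sentence often look positive on their own.

```python
# Toy lexicon-based sentiment scorer (illustrative only -- NOT how the
# Azure Sentiment Analysis service works internally).
POSITIVE = {"awesome", "great", "love", "wonderful"}
NEGATIVE = {"terrible", "awful", "hate", "broken"}

def score_sentiment(text: str) -> str:
    """Classify text by counting positive vs. negative words."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

# Both utterances score "positive" -- but the second is sarcastic.
# The sarcasm lives in context (tone of voice, prior conversation),
# which a text-only model never sees.
print(score_sentiment("The demo was awesome!"))                 # positive
print(score_sentiment("Oh great, the build is awesome again"))  # positive
```

Exactly as the Transparency note says: “that was awesome” can be positive or negative, and with only the text as input, the classifier has no way to tell.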

Image: Yesterday at the dev up Conference in St. Louis, practicing Responsible AI Speakership
