🕹️ User control – and trust issues


Lately I’ve been obsessing over affective computing and where it is taking us. Maybe we are obsessing over AI (artificial intelligence) for no good reason. Maybe AI is just another tool for making us consume more and making us lazier than we already are. As usual, I have no clue.

I believe that AI and affective computing can be used for so much good stuff, and I have been reading basically everything published by Rosalind Picard (ain’t she cool). For example, it can be used for optimising digital learning, or for learning agents for people with cognitive impairments and similar. It can help us find better and more relevant ways to present information for our users’ specific needs. But I’m guessing that, like all cool engineering stuff, most of the research is focused on “how can we sell more to people” and “how can we make the world a little shittier than it already is”.

But now that AI and machine learning are so god damn hot, and everything needs to be automated, we’re running the risk of using AI for everything, including a lot of things that certainly don’t need to be ‘intelligent’. Read this article, it is very good. But my biggest fear is that we’re getting too fucking close to real artificial intelligence, where interfaces and services start running their agents 24/7 and start making decisions without our knowledge (I know it’s already happening). It’s like the shake and bake issue: no one gets satisfied without getting a little dirty. We need to crack that egg, mix it together and shove it in the oven to feel like we’ve accomplished something. We need to take action, not only for the feeling of accomplishment but also to maintain some sort of control. We have apps and services that help us with everything. Even if the action would only be pushing a button, I think we need to keep that control in our life, to feel satisfied and to feel safe. We wouldn’t say yes to a robot wiping our asses; some stuff needs to be left for manual labour.

The article I linked above gives examples of questions we as product developers can ask ourselves when deciding whether or not to use AI. And on this topic I also read this article, which I’m going to quote now:

“The base emotional state isn’t the most relevant piece here, but the permission that the user gives to enter that emotion is”

“As with any relationship, trust is key, and the ways brands craft their value propositions, brand personalities, and their communication strategies play crucial roles in maintaining and deepening that trust.”

Yes yes, very clever. This is not an original idea; I have very few of those. I just like to vent a little. I think this goes hand in hand with “don’t trust trends”.

I will come back to this topic, be sure.

//Moa