
It's for your own good

#DailySignals - your 2-minute preview of the future

Personal wellbeing patents for technology that can not just monitor but predict and (so they say) prevent burnout and other mental health breakdowns are the topic of today's signal. Microsoft follows in Apple's footsteps with a patent filing for technology that combines machine learning with biometrics (like your heart rate and typing speed) and other "social cues" (that is, monitoring your messages and calendar entries) to anticipate when you're heading for a breakdown.

But what about the conflict between corporations' responsibilities to their clients and the "greater good" on the one hand, and individuals' rights to personal sovereignty on the other, when it comes to monitoring mental health?

(Tracey Follows has written far more extensively than I have on this topic, and I highly recommend you follow her for more in-depth insights into where this is headed.)

When it comes to the greater good, school shootings and chatbot-inspired suicides are sobering reminders that mental health is a health and safety issue for society at large. At the same time, are we ready for our personal chat messages to be used to alert our employers (staff and management mental health is a growing concern and a significant corporate cost), our insurers, and our national authorities when we may be a danger to ourselves or others?

What are your thoughts?

Is neuro-biometric surveillance a necessary measure?

What obligations do tech companies have to their clients and countries when it comes to monitoring personal physical and mental health?

Should bio- and neuro-monitoring be opt-in, or should there be no opt-out at all?




Thoughts
Author
Bronwyn Williams