The future of security and safety technologies

Businesses and communities across the country have been forced to grapple with how important security is in a world of growing threats to our data privacy, our places of worship, our schools, our families and our businesses. Unfortunately, little progress has been made toward solving the security threats we face, and over the past decade incidents both at home and abroad have reignited the debate over what an appropriate solution looks like. That debate has raised valid questions about solutions that don't go far enough, as well as those that perhaps go too far.

This push and pull has set the stage for major conflicts between the "duty of care" these businesses and institutions owe and the individual privacy of the people they serve. There seems to be no easy answer, as evidenced by the plethora of lawsuits filed in venues around the country over privacy and the use of security technologies. Consider the capabilities of the new technologies now available: facial recognition; AI data analytics that can predict behaviors, infer socio-demographic characteristics and analyze mood (happy, sad, angry, frightened); IoT solutions that allow the mapping and tracking of people; audio detection that can distinguish one sound from another, such as a gunshot or someone climbing a fence; and ever-improving video surveillance. The pace at which these technologies are hitting the market is far outpacing society's ability to absorb and implement them in a way that balances personal and institutional privacy.

These are difficult questions for private citizens and businesses alike, and the answers will determine whether these new technologies win widespread acceptance in the market.

Schools take annual pictures of their student body as well as their teachers, administrators and facilities teams. Should those photos' sole purpose be for parents to display at home and in their offices, or should they also be uploaded into a database so schools know who is coming on campus and whether that person has been expelled or suspended and shouldn't be there? Should mood detection be used by schools to ensure students are paying attention and getting adequate instructional time? The privacy of our children and our families is at the forefront of every parent's mind, and these types of monitoring solutions have drawn serious pushback. Parents have legitimate concerns about how pictures of their children will be used. What if the database of student and staff photos is hacked? How would that affect their child and their family? Yet in our increasingly digital society, we often voluntarily post pictures of ourselves and our families at sporting events, at parties and at home. Given that fact, are simple access control systems using facial recognition too injurious to our privacy, or does their utility outweigh the potential invasion? How should a school, faced with the urgent need to protect its students and staff, walk the fine line between duty of care and privacy?

In other soft targets, is it inappropriate to provide staff with panic buttons that track their location?

Given today's technology, panic buttons can not only notify the appropriate security staff of a threatening incident but also pinpoint the location of the staff member being threatened, in some cases to within a meter.

Sounds good, right? But if doing so means being able to track that individual as they move through the floor during the workday, is that an invasion of their privacy? Most panic buttons allow the tracking function to be turned off until the button is pushed, which addresses reasonable concerns about privacy.

But let's take it one step further. AI and predictive modeling technology can now identify the behaviors of individuals as they walk through a venue and make determinations about whether someone is acting in a manner that could be dangerous or life-threatening. All of this is possible today because of major technological advances. The question facing us all is whether that evolution crosses a line, or whether it gives businesses, schools and organizations the tools they need to meet their duty of care.

Drawing a line between duty of care and privacy will no doubt be hotly debated, and likely litigated through our justice system, over the next few years, but it is incumbent upon all of us to help navigate these waters. It is in our best interest to play our part in this evolution so that our businesses and communities have the tools and solutions necessary to keep themselves secure.

Peg McGregor is the CEO of Technovation Solutions. She has more than 25 years of entrepreneurial experience in business development, strategic partnering, integrated and strategic marketing and business strategy development. McGregor has served as managing partner and president of numerous organizations providing digital strategy, data analytics, CRM, business creation and development strategy and execution.
