All posts by Danny Lieberman

About Danny Lieberman

Born in Washington DC, lives in Israel. Danny has a graduate degree in solid state physics and is a professional software security analyst, serious amateur saxophonist and XC rider.

Use ML and AI to automate IT operations

Gartner is the world’s leading research and advisory company for IT. They pretty much invented the space. Gartner often coins new models – one of them is AIOps, the idea of using machine learning to drive IT operations. The model spans areas such as:

  • Performance management (“Observe”)
  • Automation (“Act”)

What’s AIOps?

Besides combining AI with a short catchy word like Ops (which sounds like Black Ops, IT Ops, Clinical Ops), AIOps is the evolution of IT operations analytics (ITOA), which grew out of the extreme growth of cloud computing. Several forces drove that evolution:

  • The amount of data that we need to retain is increasing exponentially. Performance monitoring is generating exponentially more events and alerts, and service ticket volumes see step-function increases with the introduction of IoT devices, APIs, mobile applications, and digital or machine users. It is simply becoming too complex for manual reporting and analysis.
  • Infrastructure problems must be addressed at ever-increasing speeds. As organizations digitize their business, IT becomes the business. The “consumerization” of technology has changed user expectations for all industries. Reactions to IT events — whether real or perceived — need to occur immediately, particularly when an issue impacts user experience.
  • More computing power is moving to the edges of the network. The ease with which cloud infrastructure and third-party services can be adopted has empowered line of business (LOB) functions to build their own IT solutions and applications. Control and budget have shifted from the core of IT to the edge. And more computing power (that can be leveraged) is being added from outside core IT.
  • Developers have more power and influence but accountability still sits with core IT. As I talk about in my post on application-centric infrastructure, DevOps and Agile are forcing programmers to take on more monitoring responsibility at the application level, but accountability for the overall health of the IT ecosystem and the interaction between applications, services, and infrastructure still remains the province of core IT. ITOps is taking on more responsibility just as their networks are getting more complex.

Automation does not replace humans

Acknowledging that cloud computing is exceeding human scale does not mean that machines are replacing humans. It means we need automation to deal with the new reality. Humans aren’t replaced, but operations people will need to develop new skills, and new roles will emerge.

The elements of AI-driven operations

I’m going to take a moment here to go through the elements of AIOps as represented in Gartner’s AIOps model.

  • Aggregated big data platform. At the heart of the platform is big data. As the data is liberated from siloed tools, it needs to be brought together to support next-level analytics. This needs to occur not just offline – as in a forensic investigation using historical data – but also in real time as data is ingested. See my other post for more on AIOps and big data.
  • Machine learning. Big data enables the application of ML to analyze vast quantities of diverse data. This is not possible before the data is brought together, nor by manual human effort. ML automates existing, manual analytics and enables new analytics on new data – all at a scale and speed unavailable without AIOps. (A minimal sketch of this kind of analysis follows this list.)
  • Observe. This is the evolution of the traditional ITOM domain that integrates development (traces) and other non-ITOM data (topology, business metrics) to enable new modalities of correlation and contextualization. In combination with real-time processing, probable-cause identification becomes simultaneous with issue generation.
  • Engage. The evolution of the traditional ITSM domain includes bi-directional communication with ITOM data to support the above analyses and auto-create documentation for audit and compliance/regulatory requirements. AI/ML expresses itself here in cognitive classification plus routing and intelligence at the user touchpoint, e.g., chatbots.
  • Act. This is the “final mile” of the AIOps value chain. Automating analysis, workflow, and documentation is for naught if responsibility for action is put back in human hands. Act encompasses the codification of human domain knowledge into the automation and orchestration of remediation and response.
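To make the machine learning element a little more concrete, here is a minimal sketch (referenced in the Machine learning bullet above) of the kind of automated event analysis an AIOps platform might run. This is not Gartner’s method or any vendor’s pipeline – just an illustration using scikit-learn’s IsolationForest on hypothetical per-minute metrics, with invented feature names and thresholds.

```python
# Minimal sketch: unsupervised anomaly detection over aggregated IT event metrics.
# Hypothetical data and feature names - not any specific AIOps product's pipeline.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulated per-minute metrics: [event_count, mean_latency_ms, error_rate]
normal = np.column_stack([
    rng.poisson(120, 1000),          # typical event volume
    rng.normal(200, 25, 1000),       # typical latency
    rng.normal(0.01, 0.003, 1000),   # typical error rate
])

# A few minutes that look like an incident: event storm, high latency, error spike
incident = np.array([[900, 850, 0.20], [1100, 900, 0.25]])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# -1 = anomaly, 1 = normal; in an AIOps pipeline this would feed "Observe" and "Engage"
print(model.predict(incident))        # expected: [-1 -1]
print(model.predict(normal[:5]))      # mostly [1 1 1 1 1]
```

The same idea scales from three toy features to the thousands of metrics, traces and topology signals described above.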

The future of AI-driven operations

As IT moves beyond human scale, we need to automate even more. But simply reacting defensively is not enough – AI-driven operations also opens up new opportunities:

  • The automation of technology, and, hence, business processes: Costs lower, speed increases, and errors decrease while freeing up human capital for higher-level achievement.
  • Enterprise ITOps gains DevOps agility: Continuous delivery extends to operations and the business.
  • Data becomes currency: The vast wealth of untapped business data is capitalized, unleashing high-value use cases and monetization opportunities.

 

Originally published on Medium.

3 things to do before you spend money on a HIPAA consultant

Before you start spending money on regulatory consultants, get back to basics.

Do you or do you not need to comply with the HIPAA Security Rule?

If you do – what is the very first thing you should do?

In this post, we will get back to basics with 3 practical ways of complying and reducing your regulatory risk.

I specialize in cyber security and privacy compliance consulting for medical device companies in Israel.  While this may sound like a niche, it is actually a very nice and not so small niche – with over 700 biomed vendors and a market growing 7-8% / year.

Israeli biomed startups are incredibly innovative and it’s fun working with smart people.  Here are 3 ways to improve your HIPAA risk analysis in just a few minutes:

Check # 1 – Maybe you are not a HIPAA Business associate

If you are a medical device vendor and you connect to a hospital network or share data with doctors, you are automatically a BA – a business associate according to the HIPAA Business Associate definition. If you are a BA, you need to comply with the HIPAA Security Rule. But maybe you are not a BA.

By law, the HIPAA Privacy Rule applies only to covered entities – health plans, health care clearinghouses, and certain health care providers. A “business associate” is a person or entity that performs certain functions or activities that involve the use or disclosure of protected health information on behalf of, or provides services to, a covered entity.  If an entity  does not meet the definition of a covered entity or business associate, it does not have to comply with the HIPAA Rules.  See definitions of “business associate” and “covered entity” at 45 CFR 160.103.

So – if you developed a consumer mobile app that monitors stress, with a cool dashboard that visualizes data stored in the cloud so your users can reduce their stress and compare themselves with other people like them – then as long as the software doesn’t share data with a covered entity (like their doctor), you are not a BA. Check.

 Check # 2 – Maybe you don’t even need to comply with HIPAA

HIPAA applies to the creation, storage and transmission of EPHI (electronic protected health information). Simply put, EPHI is the combination of PII (personally identifiable information – such as a name, email and social security number) and healthcare / clinical data.

So – getting back to our hypothetical mobile medical app for personal stress tracking, let’s suppose that the stress data includes heart rate and respiratory rate, in addition to how many times a day the consumer called their girlfriend to make sure she still loves them (when they are freaking out with stress before a big exam). If your mobile medical app for stress management doesn’t store personal information with the clinical data, then you don’t have to comply with HIPAA, because you don’t create, store or transmit EPHI in the cloud. Check.
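To summarize Checks #1 and #2, here is a hypothetical helper that reduces the two checks to two boolean questions. The function names and flags are my own illustration – not legal advice and not taken from the HIPAA text.

```python
# Hypothetical illustration of Checks #1 and #2 - not legal advice.
def is_business_associate(handles_phi_for_covered_entity: bool) -> bool:
    """Check #1: you are a BA only if you create/receive/maintain/transmit PHI
    on behalf of a covered entity (or another BA)."""
    return handles_phi_for_covered_entity

def stores_ephi(stores_clinical_data: bool, stores_identifiers: bool) -> bool:
    """Check #2: EPHI is clinical data combined with identifying information (PII)."""
    return stores_clinical_data and stores_identifiers

def must_comply_with_security_rule(handles_phi_for_covered_entity,
                                   stores_clinical_data, stores_identifiers):
    return is_business_associate(handles_phi_for_covered_entity) and \
           stores_ephi(stores_clinical_data, stores_identifiers)

# The consumer stress app from the example: clinical data (heart rate), no PII,
# and no sharing with a covered entity -> no Security Rule obligation.
print(must_comply_with_security_rule(
    handles_phi_for_covered_entity=False,
    stores_clinical_data=True,
    stores_identifiers=False,
))  # False
```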

Check # 3 – Using contractors for software development? Vet them and oversee them

There is a commonly-used expression in Hebrew – “launch and forget” (שגר ושכח). I believe that this is a Hebrew translation of the American English term “fire and forget”, which refers to a type of missile guidance that does not require further guidance after launch.

When it comes to contractors you do not want to “launch and forget”.

Maybe you are a BA and you have to comply with HIPAA. The HIPAA Security Rule does not have a standard safeguard for vetting contractors, but can you afford to gloss over this area?

This is a big one, boys and girls. If you use contractors for code development, make sure you thoroughly vet them before engaging with them on Upwork. I am not talking about quality of work – it is a given that you need someone highly competent in whatever problem domain you are trying to crack (your girlfriend’s brother-in-law may not be your best fit). I am talking about the threat of a contractor stealing your code, dropping Android malware into your app, or enabling tap-jacking to steal personal data.

Even if they don’t steal your code, consider the threat of your contractor working for a competitor, leaking your IP or being hired by a business partner who click-jacks your entire business model.

It’s tempting to work with software developers in Ukraine or China, but be careful. Consider the case of the Russian programmers who wrote code for U.S. military communications systems, or the case where the DOJ accused the firm that vetted Snowden of faking 665,000 background checks.

In the Federal space, it is sad but true that there is a huge Federal procurement machine that enables these kinds of attacks to happen, because of organizational complexity (a nice way of saying that accountability falls between the cracks) and greed (a not so nice way of saying that big contracts are a challenge to morality and governance).

US Federal agency purchasing managers who sign purchase orders without appropriate contractor vetting and oversight are not a justification for a privately-held Israeli medical device company to sin…

The calls for more legislation are a knee-jerk reaction to horses that left the barn years ago, but when common sense and good business practices yield to greed, perhaps we do need a tail wind. Vetting software development contractors should be a standard requirement in core security guidance such as HIPAA, PCI and FDA cyber security, but in the meantime – don’t wait for the US Federal Government to tell you to vet your contractors and oversee their work. Put an accountability clause into your contracts.

The golden rules of medical IoT HIPAA compliance

In the past 5 years, a lot has happened in the digital health space. Venture funding in 2018 was close to $10BN and a lot of work is being done.

As our customers progress through their clinical trial journey to FDA clearance and post-marketing, we are frequently asked how to achieve HIPAA compliance in an era of digital health apps, medical IoT and collection of RWD (real-world data) from patients. I will try and help you make sense of HIPAA and the HITECH Act (Health Information Technology for Economic and Clinical Health).

On January 25, 2013, the HIPAA Omnibus Rule was published in the Federal Register, which created the final modifications to the HIPAA privacy and security rule. You can see the source of the law here.

The HITECH Act created a supply chain trust model.

According to 45 CFR 164.502(e), the Privacy Rule applies only to covered entities (healthcare providers, health plans and healthcare clearinghouses). Going down the chain, covered entities have suppliers who are defined as BAs (business associates). A business associate is a supplier that creates, receives, maintains, or transmits protected health information on behalf of a covered entity or other business associates.

The HITECH Act requires suppliers in the chain of trust to comply with the Security Rule. A medtech company and its cloud service providers, customer engagement service providers et al are all business associates.

The HITECH Act does not impose all Privacy Rule obligations upon a BA but:

1. BAs are subject to HIPAA penalties if they violate the required terms of their BA Agreement (BAA).

2. BAs may use or disclose PHI only in accordance with the required terms of their BAA.

3. BAs may not use or disclose PHI in a manner that would violate the Privacy Rule if done by the CE (covered entity).

Down the supply chain and to the right

When we go downstream in the supply chain, the BAA becomes more and more restricted regarding permissible uses and disclosures.

For example, if a business associate agreement between a covered entity and a supplier does not permit the supplier to de-identify protected health information, then the business associate agreement between the supplier and a subcontractor (and the agreement between the subcontractor and another subcontractor) cannot permit the de-identification of protected health information. Such a use may be permissible if done by the covered entity, but is not permitted by the downstream suppliers in the supply chain, if it is not permitted by the covered entity’s business associate agreement with the contractor.
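A useful way to think about this downstream restriction is as set intersection: a subcontractor can never be granted a use that the upstream BAA did not already permit. A toy sketch, with made-up permission names:

```python
# Toy illustration of downstream BAA restriction - permission names are made up.
def downstream_permissions(upstream_permitted: set, requested: set) -> set:
    """A subcontractor may only be granted uses already permitted upstream."""
    return upstream_permitted & requested

covered_entity_to_supplier = {"store_phi", "transmit_phi"}          # no "de_identify"
supplier_requests_for_subcontractor = {"store_phi", "de_identify"}

print(downstream_permissions(covered_entity_to_supplier,
                             supplier_requests_for_subcontractor))
# {'store_phi'} - de-identification cannot be granted downstream because the
# covered entity's BAA with the supplier never permitted it.
```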

Concrete example of a digital therapeutic.

A physician (covered entity) prescribes a digital therapeutic app. The physician writes a script that is sent to a customer service center, which provides customer support to patients to download and use the app.

The healthcare provider will need a BAA with the digital therapeutics company (or its customer service center, which may be a separate business), which in turn has BAAs with other online suppliers for cloud and Braze customer engagement services. Graphically, the supply chain looks like this:

As we move down the supply chain and to the right, we see that the suppliers are providing specific and more restricted digital services.

[Figure: Digital therapeutics HIPAA supply chain]

 

The golden rule

Although a BAA is a formal regulatory requirement, it entails compliance with the HIPAA Security Rule and possible exposure to Privacy Rule disclosures. To a large degree, the Golden Rule applies – “He who has the gold rules”. For early stage medtech and digital therapeutics companies, your customers have the gold. Do your homework on your security and privacy risk assessment. Consider external threats as well as possible exploits and cascade attacks on your APIs.

Killed by code – back to the future

Back in 2011, I thought it would only be a question of time before we saw a drive-by execution of a politician with an ICD (implanted cardiac device).

On January 9, 2017, the FDA issued a Safety Communication on “Cybersecurity Vulnerabilities Identified in St. Jude Medical’s Implantable Cardiac Devices and Merlin@home Transmitter”.

At risk:

  • Patients with a radio frequency (RF)-enabled St. Jude Medical implantable cardiac device and corresponding Merlin@home Transmitter
  • Caregivers of patients with an RF-enabled St. Jude Medical implantable cardiac device and corresponding Merlin@home Transmitter
  • Cardiologists, electrophysiologists, cardiothoracic surgeons, and primary care physicians treating patients with heart failure or heart rhythm problems using an RF-enabled St. Jude Medical implantable cardiac device and corresponding Merlin@home Transmitter

I’ve been talking to our medical device customers about mobile security of implanted devices for over 4 years now.

I  gave a talk on mobile medical device security at the Logtel Mobile security conference in Herzliya in 2012 and discussed proof of concept attacks on implanted cardiac devices with mobile connectivity.

But ICDs are the edge, the corner case of mobile medical devices. If a typical family of 2 parents and 3 children has 5 mobile devices, it is a reasonable scenario that this number will double with devices for fetal monitoring, remote diagnosis of children, home-based urine testing and more.

Mobile medical devices are becoming a pervasive part of the Internet of things; a space of  devices that already outnumber workstations on the Internet by about five to one, representing a $900 billion market that’s growing twice as fast as the PC market.

There are 3 dimensions to medical device security – regulatory (FDA), political (Congress) and cyber (vendors implementing the right cyber security countermeasures).

The FDA is taking a tailored, risk-based approach that focuses on the small subset of mobile apps that meet the regulatory definition of “device” and that:

  • are intended to be used as an accessory to a regulated medical device, or
  • transform a mobile platform into a regulated medical device.

Mobile apps span a wide range of health functions. While many mobile apps carry minimal risk, those that can pose a greater risk to patients will require FDA review. The FDA guidance document provides examples of how the FDA might regulate certain moderate-risk (Class II) and high-risk (Class III) mobile medical apps. The guidance also provides examples of mobile apps that are not medical devices, mobile apps for which the FDA intends to exercise enforcement discretion, and mobile medical apps that the FDA will regulate, in Appendix A, Appendix B and Appendix C respectively.

Mobile and medical and regulatory is a pretty sexy area and I’m not surprised that politicians are picking up on the issues. After all, there was an episode of CSI New York  that used the concept of an EMP to kill a person with an ICD, although I imagine that a radio exploit of  an ICD or embedded insulin pump might be hard to identify unless the device itself was logging external commands.

Congress is, I believe, more concerned about the regulatory issues than the patient safety and security issues:

Representatives Anna Eshoo (D-CA) and Ed Markey (D-MA), both members of the House Energy and Commerce Committee, sent a letter last August asking the GAO to study the safety and reliability of wireless healthcare technology and report on the extent to which the FCC is:

  • Identifying the challenges and risks posed by the proliferation of medical implants and other devices that make use of broadband and wireless technology.
  • Taking steps to improve the efficiency of the regulatory processes applicable to broadband and wireless enabled medical devices.
  • Ensuring wireless enabled medical devices will not cause harmful interference to other equipment.
  • Overseeing such devices to ensure they are safe, reliable, and secure.
  • Coordinating its activities with the Food and Drug Administration.

At  Black Hat August 2011, researcher Jay Radcliffe, who is also a diabetic, reported how he used his own equipment to show how attackers could compromise instructions to wireless insulin pumps.

Radcliffe found that his monitor had no verification of the remote signal. Worse, the pump broadcasts its unique ID, so he was able to send the device a command that put it into SUSPEND mode (a DoS attack). That meant Radcliffe could overwrite the device configuration to inject more insulin. And once insulin is injected, it cannot be removed from the body – the patient can only counteract it by eating something sugary.

The FDA’s position – that it is sufficient to warn medical device makers that they are responsible for updating equipment after it is sold – and the downplaying of the threat by industry groups like the Advanced Medical Technology Association are not constructive.

Following the proof-of-concept attack on ICDs by Daniel Halperin from the University of Washington, Kevin Fu from UMass Amherst, et al. (“Pacemakers and Implantable Cardiac Defibrillators: Software Radio Attacks and Zero-Power Defenses”), this is a strident wake-up call to medical device vendors to implement more robust protocols and tighten up the software security of their devices.
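One of the “more robust protocols” that paper points toward is simply authenticating and replay-protecting commands before the device acts on them. Here is a minimal sketch of HMAC-based command verification with a monotonic counter; the key provisioning, message framing and command names are assumptions for illustration, not any vendor’s actual protocol.

```python
# Minimal sketch of authenticated device commands - framing and key handling are
# illustrative assumptions, not any vendor's actual protocol.
import hmac, hashlib, os

DEVICE_KEY = os.urandom(32)   # in practice: provisioned securely at manufacture

def sign_command(key: bytes, counter: int, command: bytes) -> bytes:
    msg = counter.to_bytes(8, "big") + command
    return hmac.new(key, msg, hashlib.sha256).digest()

class Device:
    def __init__(self, key: bytes):
        self.key = key
        self.last_counter = 0          # monotonic counter defeats simple replay

    def handle(self, counter: int, command: bytes, tag: bytes) -> bool:
        if counter <= self.last_counter:
            return False               # replayed or stale command
        expected = sign_command(self.key, counter, command)
        if not hmac.compare_digest(tag, expected):
            return False               # unauthenticated command (e.g. broadcast-ID spoof)
        self.last_counter = counter
        # ... only now act on the command
        return True

dev = Device(DEVICE_KEY)
tag = sign_command(DEVICE_KEY, 1, b"SET_BASAL_RATE 0.8")
print(dev.handle(1, b"SET_BASAL_RATE 0.8", tag))     # True - authentic
print(dev.handle(1, b"SET_BASAL_RATE 0.8", tag))     # False - replay rejected
print(dev.handle(2, b"SUSPEND", b"\x00" * 32))       # False - no valid signature
```

The point of the sketch is only that an unauthenticated SUSPEND or dosing command, like the one Radcliffe sent, would be rejected before the device acts on it.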

A word to Teva on firing employees and assuring data security

To be able to do something before it exists,
sense before it becomes active,
and see before it sprouts.

The Book of Balance and Harmony (Chung-ho chi).
A medieval Taoist book

In early December 2017, the Israeli pharmaceutical generics company Teva announced it would lay off about 1,700 of its employees in Israel – roughly 25% of the company’s 6,680 employees in Israel. Without diving into the emotional implications and political opportunities the big layoff creates, I suggest taking a different look at the problem.

What kind of risk are you creating when you fire a big chunk of your work force?

When a big, global, publicly-traded company like Teva decides to fire a big piece of its workforce, it’s to reduce costs in anticipation of reduced revenues and to preserve or improve the share price.

Risk management and IT governance runs a distant second and third when it’s a question of survival. The IT department is often in the line of fire, since they’re a service organization. The IT security staff may be the first to get cut since  companies view information security as a luxury, not as a must to run the business.

There is nothing in the information security policy of any organization that I have seen that talks about how to manage risk when 1700 employees are being fired in a short period of time in a business unit.

When firing large numbers of employees, the unauthorized network transfer of sensitive digital assets belonging to the company should be (but rarely is) a key concern for the CEO. Here are a few real examples of trusted-insider theft of digital assets and intellectual property during a big RIF:

  • Sending suppliers  classified RFP documents
  • Exploiting production servers with anonymous file transfer protocol (FTP) turned on in order to send large quantities of confidential product design documents
  • Break-ins, bribes and double agents (workers who spy for other groups or companies) taking advantage of the chaos caused by RIFs and strikes.

The business need to use advanced technology to detect and prevent data loss goes straight to the CEO and the management team, and in firms with outsourced IT infrastructure (like Teva) the need for data loss prevention becomes even more acute, as more and more people are involved with less and less allegiance to the firm.

High risk appetite and waiting until the last minute?

In my experience (and this is supported by prospect theory), highly paid CEOs wildly underestimate – to the point of ignoring completely – high-impact, low-frequency events like trusted insiders and outsourced IT staffers stealing IP during a big RIF.

In normal times, a key part of formulating and establishing information security policies for your organization is deciding how much risk is acceptable and how to minimize unacceptable risk.

This process initially involves undertaking a formal risk assessment, which is a critical part of any ISMS (information security management system). However, it’s a mistake to assume that risk assessment is a static process when the business itself is dynamic.

Risk assessment must be dynamic and continuous, moving with the front line of the business – not as an afterthought, and not skipped altogether.

When a company fires at this scale, the words dynamic and continuous take on new meaning. We are not in Kansas anymore, where we could ask KPMG to come in and do an organizational risk assessment using their standard questionnaires.

In times of massive layoffs, you need to throw away the standard forms and use a threat-analysis-based checklist to reevaluate your digital value at risk on a daily basis. The rationale behind the threat analysis is to mitigate the tendency of top management to ignore high-impact, low-frequency events (a minimal scoring sketch follows the checklist):

  • Think like an attacker. What would you steal if you had the opportunity?
  • Use a systematic approach to estimate the magnitude of risks (risk analysis).
  • Compare estimated risks against risk criteria to measure the significance of the risk (risk evaluation).
  • Define the scope of the risk assessment process to improve effectiveness (risk assessment).
  • Undertake risk assessments periodically to address changes in assets, risk profiles, threats, safeguards, vulnerabilities and risk appetite (risk management).
  • Undertake risk measurement in a methodical manner to produce verifiable results (risk measurement).
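Here is the minimal scoring sketch mentioned above. The assets, threats and 1-5 scores are invented for illustration (not real Teva data), and the arithmetic is simply risk = likelihood x impact, re-ranked every day while the layoff is under way.

```python
# Minimal sketch of daily risk re-ranking during a RIF.
# Assets, threats and 1-5 scores are illustrative, not real data.
risk_register = [
    {"asset": "product design documents", "threat": "insider FTP exfiltration",
     "likelihood": 4, "impact": 5},
    {"asset": "classified RFP documents",  "threat": "leaked to suppliers",
     "likelihood": 3, "impact": 4},
    {"asset": "source code repositories",  "threat": "departing contractor copies IP",
     "likelihood": 4, "impact": 4},
    {"asset": "HR / payroll records",      "threat": "disgruntled admin dump",
     "likelihood": 2, "impact": 3},
]

for item in risk_register:
    item["risk"] = item["likelihood"] * item["impact"]   # risk = likelihood x impact

# Re-evaluate and re-sort daily; the top of the list drives today's countermeasures.
for item in sorted(risk_register, key=lambda r: r["risk"], reverse=True):
    print(f'{item["risk"]:>2}  {item["asset"]:<28} {item["threat"]}')
```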

 

WannaCrypt attacks

For your IMMEDIATE notice: If you run medical device Windows management consoles, run Windows Update and update your machine NOW. This is my professional advice, considering the new ransomware worm out there attacking machines.

MS17-010 has been out more than a month, but we have to assume that the majority of Windows-based medical devices and Windows-based medical device monitoring platforms are vulnerable.

For Windows XP, Windows 8, and Windows Server 2003 systems, the patches still aren’t available through Windows Update – you need to get them from https://blogs.technet.microsoft.com/msrc/2017/05/12/customer-guidance-for-wannacrypt-attacks/ and deploy them manually.

What is more important – patient safety or hospital IT?

What is more important – patient safety or the health of the enterprise hospital Windows network?  What is more important – writing secure code or installing an anti-virus?

A threat analysis was performed on a medical device used in intensive care units. The threat analysis used the PTA (Practical Threat Analysis) methodology.

Our analysis considered threats to three assets: medical device availability, the hospital enterprise network and patient confidentiality/HIPAA compliance. Following the threat analysis, a prioritized plan of security countermeasures was built and implemented including the issue of propagation of viruses and malware into the hospital network (See Section III below).

Installing anti-virus software on a medical device is less effective than implementing other security countermeasures that mitigate more severe threats – ePHI leakage, software defects and USB access.

A novel benefit of our approach is that the analytical results are delivered as a standard threat-model database, which medical device vendors and customers can use to model changes in the risk profile as the technology and operating environment evolve. The threat modelling software can be downloaded here.

Why HIPAA Policies and Procedures are not copy and paste

Compliance from Dr. Google is a very bad idea.

Searching for HIPAA Security Rule compliance yields about 1.8 million hits on Google. Some of the information is outdated and does not relate to the Final Rule, and a good deal of other information is sponsored by service providers and technology companies selling silver bullets for HIPAA compliance.

The online dialog on HIPAA Security Rule compliance is dominated by discussions of requirements for health providers. There is very little information online for the downstream medtech and medical device vendors who are increasingly using the cloud to store data and process transactions for their covered-entity customers.

If you are a small medtech or medical device company and you copy from a big healthcare provider, you will be overpaying and over-implementing SOPs that may not be relevant to your business situation.

The risk analysis for a SaaS provider or medical device that stores PHI in the cloud is not even remotely similar to the risk analysis for a hospital.

If you copy and paste a risk analysis, you won’t understand what you’re doing and why you’re doing it, and since HIPAA privacy infractions carry both criminal and civil penalties, you don’t want to even attempt to comply via Google.

For example, if you are a mobile medical device vendor, you will need to take into account technology and privacy considerations such as mobile app software security, application activity monitoring, and mobile and cloud data encryption and key management – none of which may be relevant for a traditional hospital-run electronic health records system.
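For the encryption and key management consideration just mentioned, here is a minimal sketch of field-level encryption of PHI before it leaves a mobile app for the cloud, using the third-party Python cryptography package (Fernet). The record fields are made up, and a real deployment would keep the key in a platform keystore or KMS rather than generating it next to the data.

```python
# Minimal sketch: encrypt PHI client-side before cloud upload.
# Field names are illustrative; in production the key lives in a keystore/KMS.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice: fetched from a KMS / secure enclave
f = Fernet(key)

record = {
    "patient_id": "P-1001",
    "heart_rate": 72,
    "respiratory_rate": 14,
}

# Encrypt the whole record (or just identifying fields) before transmission.
ciphertext = f.encrypt(json.dumps(record).encode("utf-8"))
print(ciphertext[:40], b"...")

# The cloud stores only ciphertext; decryption happens where the key is authorized.
plaintext = json.loads(f.decrypt(ciphertext))
print(plaintext["heart_rate"])   # 72
```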

What policies and procedures do I need for HIPAA compliance?

We provide clients with a bespoke package of SOPs required by HIPAA – acceptable use, incident response, security and risk management process, disaster recovery plan, and a Security Officer job description (which is required by the Final Rule). This is in addition to the Risk Analysis / Security Assessment report (§ 164.308(a)(1)(ii)(A)).

6 reasons why  HIPAA security policies and procedures are not copy and paste:

  1. It depends on the business situation and technology model. A biotechnology company doing drug development will not have the same threat surface as a mobile health company.
  2. Your security is worse than you think. When was the last time you looked? Or had an external audit of your encryption policies?
  3. It also depends on timing – if the life science company is doing clinical research, then informed consent may release the sponsor from HIPAA privacy rule compliance. But in clinical research, physician-investigators often stand in dual roles to the subject: as a treating physician (who has to comply with the HIPAA Privacy Rule) and as a researcher (who has to comply with both GCP and 21 CFR Parts 50 and 56 regarding informed consent with adults and children).  In my experience with medical device companies, they often do clinical trials in parallel to commercial betas and work closely with physician-investigators. This means that your HIPAA Security Rule compliance posture needs to be nailed down in order to eliminate holes of privacy leakage.
  4. Life science companies have quality management systems as part of their FDA submissions – the HIPAA Security Rule policies and procedures need to become part of your quality system. We work with you to make sure that your regulatory/QA/QC leads understand what it means to implement them and help them integrate the SOPs into their own internal formats, policies and training programs.
  5. Covered entities may also impose specific requirements in their BAA on the life science vendor; we then need to customize the policies and procedures to comply with their internal guidelines. Sometimes these are quite strange, like the story of the West Coast hospital that deliberately weakened the WiFi signal of its routers in the belief that this was a security countermeasure against hacking of its wireless access points.
  6. There are also situations that intersect with other privacy regulations, such as CA 1280.15 for data breach – California is sticky on this – and if you do business with U of C, there will be additional things to cover.

Feel free to contact us  for a free quote – we’re always looking for interesting projects.

 

Encryption and medical device cyber security

I have written pieces here, here, here and here on why encryption should be a required security countermeasure for networked medical devices – but curiously, the HIPAA Security Rule – Appendix A does not specifically require encryption.

The final FDA guidance on cyber security for medical devices takes a similar position that we’ve adopted over the years – namely analyzing threats (“Hazards”) and implementing a prioritized security countermeasure plan.

In our security and privacy compliance practice for biomed in Israel, we always take the position that the first step in determining the best and most cost-effective security countermeasures for your medical device is to develop a threat model and perform a threat analysis. Encryption may or may not be the first security countermeasure you must implement in your mobile medical device (think about fixing interface bugs first…) but it will probably be in your top 5.

Regardless of why the authors of the HIPAA Security Rule did not require encryption, it is instructive to take a look at the history of encryption and how we arrived at where we are today – where encryption is widely considered to be the silver bullet of security.

History of Encryption: learn about key cryptology events throughout the ages with the Egress History of Encryption infographic. Egress Software Technologies is a provider of email encryption software, file encryption and large file transfer services.

Procedures are not a substitute for ethical behavior

Are procedures  a substitute for responsible and ethical behavior?

The behavior of former Secretary of State (and Presidential race loser) Hillary Clinton is an important example of how feeling entitled is not the exclusive domain of 20-somethings. When we do a threat analysis of medical devices, we try to look beyond the technical security countermeasures and dive into the human factors of the employees and managers of the organization.

Leadership from the front trumps security technology.

President Obama’s notion of leading from behind is problematic in the data security and governance space – leadership is about leading from the front.

President Obama’s weak position on enforcing data security and privacy in his administration (Snowden, Clinton and the NSA) set a poor example that will take years to undo and probably cost Hillary Clinton the election.

In the business environment,  management leadership from the front on data security and privacy is a more effective (as in cheaper and stronger) countermeasure than technology when it comes to mitigating trusted insider threats.

In the family environment, we traditionally see parents as responsible for taking a leadership position on issues of ethics and responsible behavior.

Has mobile changed this?

Sprint announced new services that will allow parents to set phone use limits by time of day or week and see the daily calls, text messaging and application activity of their children. Sprint Mobile Controls, powered by Safely, a division of Location Labs, allows parents to see rich graphical representations of how their family calls, texts and uses applications, and to lock phones remotely at specific times.

For example:

  • Seeing who your son or daughter has been calling or texting recently – and how often.
  • Establishing an allowed list of phone numbers from which your child can receive a call or text.
  • Seeing a list of your child’s contacts with an associated picture ranked by overall texting and calling activity.
  • Viewing what apps your child is downloading to their phone.
  • Choosing up to three anytime apps that your child can use when their device is locked.
  • Allowing your child to override phone restrictions in case of an emergency.
  • Setting alert notifications for new contacts, or School Hours and Late Night time periods.
  • Setting Watchlist contacts: Receive alert notifications when your child communicates with a Watchlist contact.

This seems like a similar play to product and marketing initiatives by credit card companies to control the usage of credit cards by children using prepaid cards like the Visa Buxx – except that in the case of Visa, the marketing message is education in addition to parental control. Visa Buxx benefits for parents and teens include:

  • Powerful tool to encourage financial responsibility
  • Convenient and flexible way to pay
  • Safer than cash
  • Parental control and peace of mind
  • Wide acceptance—everywhere Visa debit cards are welcome

Visa Buxx was introduced almost 10 years ago. I don’t have any data on how much business the product generates for card issuers, but fast-forward to December 2011 and the message of responsibility has given way to parental control in the mobile market:

In the case of mobile phones, I can see the advantage of a home privacy and security product. From Sprint’s perspective, controlling teens is a big untapped market. Trefis (the online site that analyzes stock behavior by product lines) has aptly called it “Sprint Targets Burgeoning Teen Market with Parents Playing Big Brother”.

The teen market, consisting of those in the 12 to 17 year age group, is plugged into cellular devices and plans to a much greater extent than you might imagine. According to a Pew Internet Research study, more than 75% of this group owns a wireless phone. This isn’t news to Sprint Nextel (NYSE: S) or mobile phone competitors such as Nokia (NYSE:NOK), AT&T (NYSE:T) and Verizon (NYSE:VZ).

I do not believe that technology is a replacement for education.

It will be interesting to track how well Sprint does with their teen privacy and security product and if parents buy the marketing concept of privacy controls as a proxy for responsible behavior.