
The Future of Privacy in Healthcare: An Interview with Dr. Deborah Peel


The following is an interview with Dr. Deborah Peel, a physician, national speaker on the issue of health privacy and the founder of Patient Privacy Rights.

Mohammad:  I’ve got a ton of questions about your work, so, Deborah, can you tell us about your background?

Deborah: I’m a physician. I graduated from medical school in 1974, and I went into private practice in general adult psychiatry in 1977. I’ve been in practice a very long time in the most privacy-sensitive specialty in Medicine.  Along the way, I became a Freudian analyst and learned even more about the importance of keeping healthy boundaries and protecting privacy. If you think about it, the kind of work that I’ve been doing is very collaborative. In fact, psychiatrists and mental health professionals may be among the most collaborative health professionals, bar none, because for the most part our work involves building relationships–trusted relationships–and helping people in ways that fit with who they are and what they understand.

I was chief of psychiatry for 11 years at the City of Austin’s Public Health Hospital, so at times I committed people against their will for treatment. The process of assessing what someone understands about treatment and whether they willingly agree to treatment is fundamental to the work of mental health professionals.  We really have to seek cooperation and collaboration with patients. So I’ve spent 35 years or more assessing who can meaningfully consent, and working on the conditions that improve disclosure and trust.

For example, the gold standard for record-keeping in psychoanalysis–this comes from Freud–is no records. He believed that creating records while seeing a patient actually interfered with thinking about and understanding the patient. Freud would see patients six times a week; now we see them in analysis four or five times a week. We actually know more about our patients than we know about almost anyone else. It’s one of the most intimate relationships in the world, and in some ways more intimate than husband and wife, parent and child, or best friends. You’re not going to forget something important if you spend dozens or hundreds or thousands of hours with someone.

I learned about the need for privacy from my patients. In medical school and residency, as you know, a senior doctor has hordes of students, residents and interns follow him or her around. Anywhere from two or three to ten people stand at the foot of a bed and talk to patients. Those are terrible conditions for honesty, disclosure and trust.

When I first hung out my shingle people came into my office and said, “If I pay you cash will you keep my records private?” I was stunned. They asked me because even in the paper age, U.S. health records did not stay put in doctors’ offices. Insurers and third-party payers shared records with employers and others, affecting people’s jobs and reputations. This was something I never learned in medical school or residency.

As it turns out, in the U.S., ethical mental health professionals give patients Miranda warnings: if you use a third-party payer, anything you say can be used against you in the future. The same with taking medication; every single prescription record is sold every night from all 55,000 pharmacies in the U.S. This practice has been going on for at least 15 years. These are really terrible conditions for treatment and trust in physicians. We would try to negotiate a fee that a person could afford without third-party involvement in treatment. In the paper age, each person had health records in only a few places. You might be seeing two or three doctors, you might have been in a hospital or two, so only a handful of places had records about you, and only one person could look at your records at a time.

Fast forward to how things work today. In America, no one even knows all the places where their health information is stored, used, collected, disclosed or sold. The scale of access to and use of our most sensitive personal information is unimaginable. Gradually, I ended up changing from being a physician in full-time practice to a person who started an advocacy organization, and that’s Patient Privacy Rights.

Patient Privacy Rights actually came about because I was a Freudian analyst.  Analysts study the best conditions for people to trust health professionals and disclose information to them.

We have learned that everything matters, even the physical set-up of our offices. It’s best not to have a secretary; it’s best for you to be the only person that is the contact with the patient; it’s best to have separate entrances and exits so patients don’t see one another; it’s best to have nothing too personal in the office like family photos; we use double or triple soundproofing. Everything is designed to convey we’re in a private space and the relationship is only between you and me. No one else is involved. We’re very careful about setting up the physical boundaries and technical boundaries too. Many analysts don’t use electronic records, which could be hacked or used in ways patients would not expect. State laws and federal laws require us to keep records, but many of us keep minimal records and keep them locked up. As I mentioned, the conditions that promote trust and disclosure are totally private relationships with healthy boundaries the patients expect. Treatment is just between two of us–you and me.

Lately, I’ve begun to think about why Americans believe that they should control the use of their health information. They didn’t go to medical school and haven’t heard of the Hippocratic Oath.  Why does everyone think they should control sensitive health information?  People in psychoanalysis or in psychiatric and psychological treatment often have emotional problems that arise from boundaries being broken. Many people are betrayed or abused physically or sexually, and emotionally. So establishing and maintaining healthy boundaries is very, very important to well-being and effective treatment.

To improve health and emotional well-being, trust is essential. But trust doesn’t scale. You can build trust with one person; you can’t automatically trust that person’s best friend or spouse or colleague. Trust doesn’t transfer. And humans have strong needs for trusted relationships. Healthy humans also need to regulate or control boundaries with other people. What’s so distressing about Facebook is that it gives kids and users the illusion that they’re setting the boundaries around who can see the sensitive, personal information they post. Users ‘friend’ 20 or 60 or 200 people and believe they let only a certain number of people see whatever they post. But in reality, Facebook users can’t prevent their data from being scraped and used in all kinds of hidden ways. The success of Facebook and social media, such as Google Circles, depends on creating the illusion that you’re in control of who knows what and how much about you. The need to control who sees personal things about us is reflected in the laws that protect snail mail: it’s a felony to open somebody’s snail mail. We’re going to eventually demand that same level of protection for email, because no matter how often people are informed that email isn’t private, they still share incredibly sensitive information even knowing that Google and the U.S. Government read it.

Our emotional need for privacy may even result from the properties of cells. Cells are self-regulating organisms. Cell membranes let certain things in and keep certain things out. If you rupture a cell membrane, or if viruses penetrate it and it gets sick, the cell dies. Cells are among the smallest independent living organisms; life itself depends on physiological, self-regulating mechanisms. That’s why I think that, emotionally, we have similar needs: we need to control our boundaries with other people, in our relationships.

Essentially, I realized the U.S. had no organization that spoke for patients’ rights to health privacy. First, I spent ten years in medical organizations working to get them to defend medical ethics and the patient’s right to privacy. That didn’t work, so I finally woke up and thought: Who cares about this anyway? Who cares that they can’t control who can see and use their medical records? The public cares. I started Patient Privacy Rights because I knew so much about the harms caused by the hidden flows of health data, and because insurers and employers can access health data and use it to discriminate against psychiatric patients.

I knew about privacy, the most urgent national issue in the Digital Age. I knew about the hidden “users” of medical information, and I knew we needed political strategists to be involved.

Three of my friends were immensely talented national-level lobbyists and agreed to serve on PPR’s first board. PPR was and still is a tiny organization, so we have to be strategic visionaries and think about what we do. In 2004, I started Patient Privacy Rights, and by 2007 the readers of Modern Healthcare, the largest US trade journal on healthcare and IT, had voted me the fourth most powerful person in healthcare in the U.S.

Mohammad: We were very impressed when we saw that.

Deborah:  I’ve also been on the list several times since then. And this year, I was on two more lists: I was named one of the top ten US experts in health information security and one of four “health IT iconoclasts”, along with Drs Larry Weed, Ross Koppel and Scott Silverstein.

Patient Privacy Rights has succeeded by building a powerful bipartisan coalition with over 50 organizations, from the Gun Owners of America to the ACLU, representing 10.3M citizens. From the very beginning we also reached out to technology companies that focused on the future. We sought innovative companies that were interested in privacy. When Microsoft joined our coalition in 2007, it was the largest technology company in the world. It made news and created a serious perturbation in the medical-industrial complex in the United States. The biggest player was saying the patient should decide; in practical terms, they signed our letters to Congress and worked with us to try to change the laws. They sided with us that patients should control their information.

The ultimate future of the healthcare system depends on building technologies that serve individuals, not hospital systems, not government, and not vendors. No matter how much we commercialize health care, no matter how many large organizations are involved, none of it works without a trusted relationship between two people. Even if doctors all work for companies, patients must be able to trust physicians or they won’t talk about what they’re afraid of, what’s frightening, painful, or scary. It was Hippocrates’ genius to realize that physicians had to keep information private to earn patients’ trust.

He reasoned, or perhaps learned through experience, that if he gossiped about his patients, no one would see him. He realized he could only provide the best care to his patients if they could trust him not to disclose sensitive personal information about them without permission.

I guess the only part that I left out was what led me to finally give up on pressing my professional association to defend privacy rights. In 2002, the Administration re-opened and amended the HIPAA Privacy Rule. The consent provisions were replaced with ‘regulatory permission’ for covered entities to use and disclose protected health information for treatment, payment and healthcare operations.

HHS literally took personal control of health records away from patients and gave it to data holders and institutions. The Amended HIPAA Privacy Rule granted covered entities a new right of ‘regulatory permission’, a new power to use and disclose patient health data without consent. The Amended Rule eliminated the strong rights to health information privacy patients had held for the past 150 years. This was a very radical change, but nobody knew it happened–it was never reported in the major news media. Suddenly, HHS eliminated patient control over health data in electronic health systems, which ultimately was the reason I started Patient Privacy Rights.

Mohammad: As a patient, I’m very pleased that you started that. One of the things that frustrated me watching policy-makers is that they were often subject to regulatory capture by the existing incumbents, who were operating in incorrect ways and convinced the policy-makers that their way was the only technically possible way of operating. So in effect, the policy-makers would just need to update their policies to turn what’s being done wrong into the correct way of doing things, and that would be it. Whereas technology allows you to do things in the correct way, but that doesn’t make it to the policy-maker, which I guess is one of the things that you tried to change with Patient Privacy Rights.

Deborah: Absolutely, yes. Industry sold the decision-makers and the policy-makers on the idea that the terrible technology being used was the only thing possible. They created a whole series of myths to justify the use of technologies that eliminate privacy rights. Myths such as: consent interferes with data flow; consent is too hard and too expensive to build into health IT systems; patients are too stupid and weak to make these decisions; on and on.

ABC did a 3-minute story last summer, which is online. An ABC investigative reporter sat at a table outside Starbucks with a security expert and a laptop. They opened up the laptop and the security expert showed the reporter identifiable diabetic patient records online for sale from $14.00 to $25.00 each.  Of course they opened the ABC-TV segment with someone whose records were being sold online saying, “oh my God, how did that happen?”

It’s been a very hard story to tell because the federal government, in both the Republican and Democratic administrations, wants to freely access and use the nation’s health data. Both Presidents Bush and Obama agreed on the goal that every American should have an electronic health record by 2014. Both administrations believe in virtually unobstructed use of the data for dozens of different purposes without consent. They want to use health data without our consent to improve our health, lower costs, and enable research. Congress had already written broad new exceptions to the patient’s right of consent into the HIPAA statute: any research and public health use of health data would not require patient consent, and law enforcement was granted open access to the nation’s health data.

Still, when the Amended HIPAA Privacy Rule was implemented, the preamble stated that the Amended Privacy Rule was the “floor” for privacy protections, not the “ceiling”. The idea was that stronger laws, common law, tort law, constitutional rights, constitutional decisions, medical ethics and standards of practice in communities should prevail. But none of the systems were built to comply with ethics or those rights. Initially, health IT systems were built to allow open access to health professionals and employees of healthcare-related businesses. No one back then could have imagined a future where everyone in the nation would be wired, or that cell phones would replace computers. In the beginning, it seemed that only those directly involved in treating patients could afford the technology systems and be able to access health data.

Today we know that personal health data is the most valuable commodity in the digital age. We have to confront the facts: legacy electronic health care systems were architected to violate patients’ rights of consent.  When the government decided to fund and build an electronic health system for the nation, it made $30 billion available. Instead of requiring that the $30 billion be used to re-architect and redesign HIT so citizens could control their information– not the private companies and governments—the government allowed industry to expand the privacy-violating legacy systems already in place.  Basically, the government supported industry complaints that fixing existing systems would be too expensive, too difficult, and would stifle innovation. The government ‘bought’ typical complaints industry lobbyists always make when faced with proposed new regulations.

But if you think about it, it’s the opposite. Regulations actually spur innovation and create new businesses. By the way, do you think the auto industry would have ever voluntarily put seat belts and airbags into cars? No. Why did that happen? Regulations. Regulations for the auto industry created whole new businesses, such as airbag and seat belt manufacturers, led to more efficient engines and less pollution, and saved lives.

The situation for health IT is the same. It’s true the technology is more complex, but this is not a technology problem; privacy-enhancing technologies exist and more can be built.  My late father was an internationally renowned computer scientist and a finalist for the Nobel before he died, so I’m very aware that technology, code, can do whatever we want it to. Entrenched large institutions and business monopolies are the obstacle to fixing health technology systems because their profits are derived from controlling and using patient data.

Technology’s not the problem. The problem really is the high value of health data inside and outside healthcare systems.  At least in the U.S.– and I think that is what’s happening now in the EU too– American corporations are telling the European nations that they can make billions from selling their health data. They’re putting great pressure on the EU to do things differently.

Mohammad: This is the kind of stuff that gets the Europeans and the Canadians angry before their governments proceed anyway. Let’s do things the right way. Let’s have the technology follow the correct way of organizing the world.

What does patient control mean? How should it be done correctly? What do you mean when you say patients have to be in control of their data?

Deborah: In the paper world, data didn’t flow between doctors or even to health insurers without a written consent. That’s the paper world; we had control of our health information then. Essentially, in the digital world, we all need a single place where we can dictate all of our own rules for the use of personal health information. The rules could be designed exactly the way we think and feel. For example, any time anything happens to me health-wise, say if I go to an emergency room or I see a different doctor, my rules could dictate that I want a copy of those records to be put in my health record bank and a copy sent to my medical home, my family practice doctor, my internist or my gynecologist, etc.

We could set other rules too. For example, I want my other doctors, my allergist, my orthopedic surgeon, and my skin doctor to just get updates about my new medications. In other words, we could slice and dice who we would automatically and ordinarily want to receive basic information about us and who should get all of our detailed information, because no one ever shares their entire records with a urologist or an orthopedist. We don’t tell the allergist the same things that we tell our psychoanalysts. We don’t want to and we don’t need to. By the way, the allergist doesn’t need to know what you tell your psychoanalyst. It’s not relevant. Physicians want relevant information.

The point is that consent tools could be enormously detailed to reflect how you actually want your data to be handled, collected and used; and you could set up rules for research. For example, you can ask to be contacted about any studies on personal data related to diabetes, but not studies on schizophrenia. You may not want to participate in those. We should be able to set broad rules and detailed rules. The consent tools would enable us to instantly set and change control over any health data.

Tools should allow control down to specific data fields sometime soon. You should be able to have very robust control over different kinds of information. There are very few general surgeons who need to see the records of your marital therapy. There’s probably very little reason for the podiatrist to know about your child abuse. The consent tools would operate in a way that people naturally operate with different doctors. They don’t trust them all to the same degree and they don’t tell them all about the same problems or symptoms. That’s a fact. That’s how we operate.
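To make this concrete, here is a minimal sketch, in Python, of how patient-authored sharing rules like these might be represented, down to individual recipients and categories of data. It is purely illustrative: the class names, field names, and categories are hypothetical, not taken from any existing consent tool.

```python
from dataclasses import dataclass, field
from typing import List, Set

# Hypothetical categories of health data a patient might segment independently.
CATEGORIES = {"medications", "allergies", "labs", "mental_health", "reproductive_health"}

@dataclass
class SharingRule:
    """What one recipient (a specific doctor, clinic, or researcher) may receive."""
    recipient: str                              # e.g. "allergist", "family_practice"
    allowed_categories: Set[str] = field(default_factory=set)
    updates_only: bool = False                  # e.g. new-medication updates only, not full history

@dataclass
class ConsentDirective:
    """A patient's complete, self-authored rule set, kept in one place."""
    patient_id: str
    rules: List[SharingRule] = field(default_factory=list)
    research_contact_topics: Set[str] = field(default_factory=set)  # e.g. {"diabetes"}

# Example: the allergist only ever gets medication updates; the family doctor gets everything;
# the patient is willing to be contacted about diabetes studies but nothing else.
my_rules = ConsentDirective(
    patient_id="patient-001",
    rules=[
        SharingRule("allergist", {"medications"}, updates_only=True),
        SharingRule("family_practice", set(CATEGORIES)),
    ],
    research_contact_topics={"diabetes"},
)
```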

The idea is every person will have a single consent system somewhere. Of course, the security over our directions or rules would have to be extremely good, because even those directions or rules are very revealing. Before any user, holder, creator or transferer of data could do anything with our health data, they’d have to automatically check our rules electronically and receive a ‘yes’ or ‘no’. If there was a question, we could be pinged on our cell phones or computers for an answer about the data use.
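Continuing the sketch above (same hypothetical names), the automatic check she describes could look something like this: before any use of the data, the holder queries the patient’s rules and gets back an immediate ‘yes’ or ‘no’, with a ‘ping the patient’ fallback when no rule covers the request.

```python
def check_consent(directive: ConsentDirective, recipient: str, category: str) -> str:
    """Return 'yes', 'no', or 'ask' for a proposed use of one category of data.

    'ask' means no rule covers this recipient, so the patient should be
    pinged (e.g. on a cell phone or computer) for a real-time decision.
    """
    for rule in directive.rules:
        if rule.recipient == recipient:
            return "yes" if category in rule.allowed_categories else "no"
    return "ask"  # no rule for this recipient: fall back to asking the patient directly

# Using the example rules defined above:
print(check_consent(my_rules, "allergist", "medications"))    # -> yes
print(check_consent(my_rules, "allergist", "mental_health"))  # -> no
print(check_consent(my_rules, "urologist", "labs"))           # -> ask (ping the patient)
```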

There’s another thing about independent consent tools and services. They should replace the institutional review boards and privacy boards that decide who can use our data for research. Why is there any need to have those boards when technology makes it instantaneous, cheap and easy to contact millions of people to ask permission? People can be reached today in ways that they couldn’t be reached in the paper age, for almost no cost. But as you know, today in the U.S. we don’t even have patient portals or physician portals in most cases, so the two people who really matter most in healthcare, the patient and the doctor, can’t talk to each other, can’t send each other information back and forth. The result is an absurd system where the two people who need the information most of all are the least likely to be able to have it or to use it in their relationship to improve health.

Mohammad: One of the things we try to teach people is that without the patient being in control, you get an unnecessary over-sharing of data: people don’t seek the patient’s consent and then just move the data around in a way that’s incorrect.

Deborah:  Yes, exactly.

Mohammad: But you also get under-sharing of data, which is counterintuitive to people. For example, in the UK, your family physician may say, “We’ll give it to the hospital doctor because I’ve referred you to the hospital, so they should know, but I won’t give your medical records to the social worker or to your non-profit disease charity, for example, because I don’t feel that I can justify that on behalf of the patient.” Now the patient may say, for example, I need my social worker to know, or I need my charity to know, because they’re helping me even more than you are. If you ask the patient, they would quite happily facilitate that sharing instantly. Unless you put the patient in control, you end up with under-sharing as well as over-sharing.

Deborah: I completely agree. By the way, who is the person most likely to know who needs their information to help them? It’s you; it’s not somebody else. Yet over here in the U.S.A., we have developed one-size-fits-all records. That’s another critical thing about technology: it allows tremendous individual customization. That was the genius of Steve Jobs. Those little iPods could have your own playlist of whatever you wanted to hear, and somebody else could have an entirely different playlist. Technology makes individual customization easy, and we need customization so we can share what we want with specific other people. There are so many reasons why patients should control the information. A lot of it has to do simply with error rates. Do you know the error rate in the Social Security number database in the U.S., where there’s one number that they’re keeping track of? It’s pretty significant. I think it’s something around 4%. That’s a lot of people! That’s with only one key number, so multiply that by thousands of data fields in electronic records. How correct do you think EHRs are going to be?

Mohammad: You’re saying that of the three hundred million people in the U.S.A., we could plausibly expect that at any one time, 12 to 24 million of them have the wrong Social Security number on their medical records?

Deborah:  That’s right.

Mohammad: That’s an enormous number of people.

Deborah:  The thing is people seem to imagine electronic records are somehow different and more accurate than paper documents, that they are always correct, true and honest–but they aren’t.

Mohammad: Those figures with the Social Security numbers are interesting. First of all, physicians will say that the patient is going to add a whole bunch of errors to my record (and they say it’s my record!). One of the things we showed them is: your record is already full of errors, and the only person with the time, interest and skills to fix them is the patient. So rather than saying they’re meddling in my records, say thank you for fixing the errors, because that saved me from malpractice.

Deborah: Absolutely. The people who actually know what’s wrong with them are the patients. Physicians worry that someone’s going to change some kind of significant data, but that’s not a problem, because changes should all be tracked. Here’s the thing: if the patient wants to keep some information from you, why shouldn’t they be able to? This really gets back to the issue of trust. I’m not likely to tell you something very sensitive about me if I don’t even know you. There’s a lot of very, very sensitive information, and I believe patients should be able to block out, or “segment”, whatever data they want from the records sent to a physician that they don’t know yet. Even if they know a physician, they should be able to share only the data they want to share. Physicians are trained to ask about relevant information that may be forgotten or that patients don’t realize is relevant. Medical school teaches differential diagnosis, considering alternatives, and which questions to ask.

In the U.S., there’s a debate over situations where some health information is segmented and the record is sent to a new physician. Should that record be flagged to show that something is left out? The answer to that is ‘no’. If you put a flag on a record, the first thing a physician is going to say to the patient is, “What did you leave out?”

The other mistake that physicians make is believing that records are truer or more accurate than what people tell them. The idea that records are more accurate than what patients know or tell you was never true for paper records, and it’s not true with electronic records either. The idea that the person in this transaction that doctors can’t trust is the patient is kind of insane if you think about it. That’s the one person you ought to be talking to and ought to believe first and foremost. You’re required, by the way, to make your own diagnosis in this country, because records have errors, physicians make errors, and many diagnoses and conditions evolve over time.

Let me just talk about my specialty. In mental health it’s very well known that bipolar disorder typically takes 10 to 15 years to correctly diagnose. So what do you think that means?

Mohammad: Fifteen years from the first time you see the patient?

Deborah:  Let’s say 5 to 15.

Mohammad: I’m sorry, but that’s an incredible number.

Deborah: It’s much longer than anyone thinks. Everyone thinks, oh, your doctor can tell immediately what’s wrong with you. No, actually, in many cases they can’t. Not only that, they can get it wrong, so for a long period of time you’re likely to have the wrong diagnosis in your records. Therefore, as soon as you get the correct diagnosis you had better be able to eliminate all the wrong ones so that there’s no confusion going forward. It’s a matter of patient safety. There are plenty of misdiagnoses and erroneous diagnoses that follow people around. There’s a famous e-patient here in the U.S., you might have heard of him, Dave deBronkart–I believe he’s a technology engineer. Dave got cancer; he got a data dump of his medical records from Harvard. They were filled with errors, despite the fact he was treated at one of the most famous hospital systems in the world, so he’s become a key advocate for being able to get copies of your own records and to be able to correct them.

Then there’s medical identity theft, which takes two or three years to discover in this country. It is the name for the situation where somebody steals your insurance card or numbers and presents themselves at a hospital, treatment facility, or doctor’s office in order to get services, and electronic records are created in your name by someone else. Picture what happens when that data is merged with your real data. And, by the way, data holders all want to do the merging and matching for us, rather than allowing us to collect our information and send it from doctor to doctor ourselves, so we know they have the correct information.

Medical identity theft can cost $20,000.00. You can’t recover from it as easily as you can from credit identity theft. Erroneous electronic medical records are going to be floating out there always, forever threatening to create problems for you in the future, because you can’t delete them. You don’t even know where they all are. Typically, insurers actually blame you, and either raise your premiums or drop you. It’s a nightmare when patients don’t have access to copies of all of their health information and no control over its use and disclosure. The healthcare system is going to be full of electronic records filled with errors; and for those who want to use the data for breakthrough research, how well will it work when there are significant errors in it?

In the U.S., at least one in eight people every year hides information because they know it’s not kept private or that it does not stay in the doctor’s office. One in eight Americans withholds information or actually lies to physicians, so that’s another problem if data is used for research. Actually, the most damaging outcome of not being able to control personal electronic health information is that 600,000 people a year avoid early diagnosis and treatment for cancer because they know the records aren’t private. Another 2 million a year avoid early diagnosis and treatment for depression. Millions more avoid early diagnosis and treatment for sexually transmitted diseases. In total, 40-50 million patients in the US risk their health and lives every year to try to keep sensitive personal health information from widespread hidden use, disclosure, and sale.

We say right up front, “Okay, all of you designing these systems, here are facts: millions of people aren’t getting treatment because they know they can’t control the hidden uses of their data. This causes bad outcomes. In healthcare do we want systems that produce bad health outcomes? The whole point of these systems was to improve health outcomes.” We think that’s probably our strongest argument.  It’s not just unlimited hidden discrimination in jobs, credit, and insurance, but the fact that sick people won’t get help, and they can die trying to protect the privacy of their health data.

Mohammad: I love the numbers that you’re bringing up. Where can we get some of the latest studies that you’ve been publishing on this? I want to link to them.

Deborah: There’s a lot of information on our website, www.patientprivacyrights.org. In particular, we have a white paper there about consent, called ‘The Case for Consent, Why It’s Critical to Honor What Patients Expect for Healthcare, Health IT and Privacy’, by Patient Privacy Rights. It’s on our website under ‘What we do’, ‘Policy and Development’, ‘Reports and Studies’ at: http://patientprivacyrights.org/?s=case+for+consent.

For example, it includes all the figures and citations about the people who avoid care. The famous California healthcare study about one-in-eight people hiding information is there or at: http://patientprivacyrights.org/?s=polls#CHCF. It also has the figures from the Rand Corporation monograph about how the lack of privacy affects people in the military called “Invisible Wounds of War”, page 436 (2008) or you can download it at: http://www.rand.org/content/dam/rand/pubs/monographs/2008/RAND_MG720.pdf. I don’t know if you know this, but in the military there is no privacy of healthcare records. Your superiors can always look at them to determine if you’re ready for battle.

Mohammad: Oh, wow, really?

Deborah:  Yes. Throughout my career I’ve seen members of the military who came and paid me privately so that their records wouldn’t be in the military health record system.

You probably know this sad statistic as well: in either 2010 or 2011, the number of military people, active duty or veterans, who committed suicide exceeded the number killed on battlefields.

Mohammad: Yes, I just heard that last week.

Deborah:  That is not due only to the lack of privacy of records. Obviously, other factors also contribute: there’s a very macho culture in the military; it’s considered a sign of weakness to need help, and it also may affect promotions. The people above you in rank can see that you’ve seen a psychiatrist and you may not get promoted. In fact, psychiatric treatment is often used to block high security clearances. So the lack of totally private mental healthcare inside the military health system is a contributing factor to the high death rate from suicide among military and veterans.

However, the armed forces are trying to grapple with this sad situation because not only is the lack of privacy of military records a problem in cases of post-traumatic stress disorder, depression and traumatic brain injuries, but the fact that American health records are not secure is a national security disaster.  The military is very, very interested in the protection of Americans’ health information because so many prominent state and national leaders and politicians have military health records.

Mohammad:  That’s fascinating. The issue of patient privacy extends to so many domains, and it’s been a pleasure learning more about your work. Thanks again for your time!

Deborah:  You’re welcome.


