ICYMI: June’s #maketechhuman Debate Tackled Google, NSA, and Cyberwarfare
Great technology always seems to weave its way seamlessly into our lives without our giving it a second thought. We originally adopted smartphones with no sense of the life-altering conveniences they'd offer, such as untethering us from our homes and offices, and for many it's now hard to imagine life without one.
But tech is no panacea for humanity’s ills, and there are plenty of nefarious cases out there (think Sony, NSA, Heartbleed) to make some wonder if its evil side isn’t gaining in influence. Last month #maketechhuman spoke to a handful of top privacy and security experts about the very real problems that originate from our technology, and what we should do to solve them.
In-Q-Tel CISO Dan Geer, for instance, shared his thoughts on why the U.S. government should aspire to wipe out the world's stockpiles of cyber-weapons, and how software could be made more secure by subjecting its makers to product liability. Not everyone agreed, of course. One commenter noted: “Making software developers choose between product liability and open source would destroy a bunch of investment in new software.”
At the other end of the philosophical spectrum, former NSA General Counsel Stewart Baker showed why he is a tireless champion of government-managed cyber-surveillance. Just before the U.S. Congress voted to let Section 215 of the Patriot Act expire in early June, Baker argued that letting it lapse would be a dangerous decision. Again, readers voiced opposition. One wrote, “So you want to collect all this metadata? Fine, great, just add an amendment to the constitution modifying the 4th amendment. I think it’s pretty safe to assume that most U.S. citizens would find this an unwarranted search.”
Whether or not governments should collect and store personal information, there’s no denying that Internet companies are built on a solid foundation of human data. Isabelle Falque-Pierrotin, current head of France’s data protection watchdog CNIL, argues that corporations should no longer be able to collect people’s data without consequence. In June, CNIL issued a 15-day ultimatum to Google: extend the “right to be forgotten” to all of its sites, not just those in the EU, or face sanctions. Falque-Pierrotin wants to turn corporate privacy from an unregulated afterthought into a mandatory default setting.
Outside of the privacy and security debate, #maketechhuman caught a glimpse of technology’s potential to solve some large-scale problems. Patients with rare diseases, long ignored by Big Pharma, are finding answers and treatments from nascent crowdsourcing platforms. And while governments often prove inept at tackling humanitarian issues, new models for “global solutions networks” are tapping into tech to help people in dire need.
Of course, #maketechhuman isn’t the only place where influencers are grappling with the big issues at stake, including the fate of our relationship with technology. Pope Francis, in his remarkable climate-change manifesto, echoes much of the #maketechhuman spirit:
“Humanity has entered a new era in which our technical prowess has brought us to a crossroads. We are the beneficiaries of two centuries of enormous waves of change: steam engines, railways, the telegraph, electricity, automobiles, aeroplanes, chemical industries, modern medicine, information technology and, more recently, the digital revolution, robotics, biotechnologies and nanotechnologies. It is right to rejoice in these advances and to be excited by the immense possibilities which they continue to open up before us, for science and technology are wonderful products of God-given human creativity.”
Now it’s your turn to respond. Is the Pope a Pollyanna? Do you agree with Baker’s idea that we need more cyber-surveillance? Is Falque-Pierrotin right in demanding more privacy controls from Internet giants? How do you think technology and humanity are helping each other improve? Share your thoughts in the comments and on reddit. #maketechhuman
Source: Wired
We are Fossasia. Stay connected with us on Twitter!