Sunday, 13 April 2014

Kroes: thoughts on NETmundial and the Future of Internet Governance

My thoughts on NETmundial and the Future of Internet Governance - European Commission: "I found some of the language related to human rights unnecessarily weak. I refer in particular to the passage "Internet governance should be open, participatory, Multistakeholder, technology-neutral, sensitive to human rights". We have an obligation to respect and promote human rights, not merely be "sensitive" to them, and this should be clearly reflected throughout the outcome document. This includes, among a number of important issues, the protection of privacy and personal data protection, which should have a prominent role in the outcome document.

Secondly, self-regulation and self-organisation of different stakeholders are certainly to be preserved and promoted. However, this cannot be to the detriment of basic democratic principles. It is not sufficient that the mechanisms through which "different stakeholder groups […] self-manage their processes [are] based on publicly known mechanisms", if this results in the explicit or implicit exclusion of persons in a manner that would contradict democratic processes." 'via Blog this'

Thursday, 13 March 2014

Web We Want | Celebrating the free, open, universal Web

Web We Want | Celebrating the free, open, universal Web: "Web We Want campaign is calling on people around the world to stand up for their right to a free, open and truly global Internet. The first step: Drafting an Internet Users Bill of Rights for every country, proposing it to governments and kickstarting the change we need. There are three ways to get started:

  • Add your name to the Web We Want mailing list to the right. We’ll keep you informed as our campaign begins to gather momentum.
  • Start a national dialogue about the Web that your country wants. 
  • Draft an Internet Users Bill of Rights for your country, for your region or for all. 

From national regulations to an international convention, we can work together to propose the best legislation to protect our rights.
Right now the U.N. is requesting an investigation into global online surveillance. As more and more people awaken to the threats against our basic rights online, we must start a debate — everywhere — about the Web we want." 'via Blog this'

Friday, 24 January 2014

Bad science: No, Facebook won’t lose 80% of its users by 2017

What the articles covering this non-peer-reviewed paper do not state is what, to a lawyer, might be obvious: this is perfect ammunition for Facebook to deflect any antitrust investigation into its social-networking dominance - we're a virus with a cure, nothing to see here, we're not Google or MSFT. Priceless propaganda reinforcing the 'MySpace decay' mythology that Facebook encourages:
"Facebook and Myspace are vastly different contagions. For both networks, young people were the first to get the disease. But they were also the first to develop an immunity; even Facebook admits it’s beginning to lose its appeal with teenagers. Myspace, however, never evolved past the stage of infecting the young, whereas Facebook worked hard to bring on new demographics, from young professionals to senior citizens. And it’s still hungry for more, now targeting emerging markets in Africa, Asia, and Latin America. Facebook also benefited from the explosion of mobile devices in a way Myspace never could. By the time the iPhone 3G was released, Facebook had already overtaken Myspace in traffic. To use the authors’ terminology, the rise of mobile phones created a “new vector” for Facebook to spread. As of last Summer, 78 percent of its daily users were on mobile.
Finally, the authors based their projections in part on Google Trends. The study notes that Google searches for “Facebook” peaked back in December 2012 and have been falling ever since. But as the Guardian’s Juliette Garside says in her write-up, Facebook’s Google search slippage is likely due to an increase in users who access the site through its mobile app as opposed to typing “Facebook” into the Google search bar.
Despite the study’s flaws, it does pose interesting ways to think about how social networks grow and recede."
UPDATE: Facebook researchers have replied light-heartedly, but a more serious commentator, Jesse Czelusta, notes: "Perhaps the biggest hole in the Princeton "study" is the model itself--in the system of differential equations, 1) the "recovered" population can never be re-infected and 2) the larger the "recovered" population, the more rapid the decline in the "infection" rate. Strikes me as highly unrealistic. Not to mention that the paper pays no attention to network externalities, which are the real reasons Facebook is Facebook and MySpace is MySpace, and this is likely to remain the case until we run out of air in 2060."
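To make the critique concrete, here is a minimal sketch of an SIR-style model with "infectious recovery" in the spirit of the Princeton preprint: susceptibles join by contact with active users, and active users quit by contact with those who have already left. The parameter values and seed populations are illustrative assumptions, not the paper's fitted values, and the integration is a simple Euler loop rather than whatever solver the authors used.

```python
def irsir(beta=0.9, nu=0.6, i0=0.01, r0=0.001, steps=10000, dt=0.01):
    """Euler-integrate an irSIR-style model. Joining spreads via contact
    with active users (I); quitting spreads via contact with ex-users (R).
    Returns the trajectory of the active-user fraction I over time."""
    s, i, r = 1.0 - i0 - r0, i0, r0
    trajectory = []
    for _ in range(steps):
        n = s + i + r
        ds = -beta * s * i / n   # susceptibles adopt via active users
        dr = nu * i * r / n      # active users quit via contact with quitters
        di = -ds - dr            # total population is conserved
        s, i, r = s + ds * dt, i + di * dt, r + dr * dt
        trajectory.append(i)
    return trajectory
```

Run it and the active-user fraction rises, peaks, and collapses to zero. Note that no term ever moves mass from R back to I: an ex-user can never rejoin, which is exactly the assumption Czelusta flags as unrealistic for a network whose value depends on network externalities.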

Thursday, 28 November 2013

Internet architects propose encrypting all the world’s Web traffic

Internet architects propose encrypting all the world’s Web traffic | Ars Technica: "A vastly larger percentage of the world's Web traffic will be encrypted under a near-final recommendation to revise the Hypertext Transfer Protocol (HTTP) that serves as the foundation for all communications between websites and end users.
The proposal, announced in a letter published Wednesday by an official with the Internet Engineering Task Force (IETF), comes after documents leaked by former National Security Agency contractor Edward Snowden heightened concerns about government surveillance of Internet communications. Despite those concerns, websites operated by Yahoo, the federal government, the site running this article, and others continue to publish the majority of their pages in a "plaintext" format that can be read by government spies or anyone else who has access to the network the traffic passes over. Last week, cryptographer and security expert Bruce Schneier urged people to "make surveillance expensive again" by encrypting as much Internet data as possible." 'via Blog this'

Keeping Secrets: Pierre Omidyar, Glenn Greenwald and the privatization of Snowden’s leaks

Keeping Secrets: Pierre Omidyar, Glenn Greenwald and the privatization of Snowden’s leaks | PandoDaily: "It’s especially worth asking since it became clear that Greenwald and Poitras are now the only two people with full access to the complete cache of NSA files, which are said to number anywhere from 50,000 to as many as 200,000 files. That’s right: Snowden doesn’t have the files any more, the Guardian doesn’t have them, the Washington Post doesn’t have them… just Glenn and Laura at the for-profit journalism company created by the founder of eBay.
Edward Snowden has popularly been compared to major whistleblowers such as Daniel Ellsberg, Chelsea Manning and Jeffrey Wigand. However, there is an important difference in the Snowden files that has so far gone largely unnoticed. Whistleblowing has traditionally served the public interest. In this case, it is about to serve the interests of a billionaire starting a for-profit media business venture. This is truly unprecedented. Never before has such a vast trove of public secrets been sold wholesale to a single billionaire as the foundation of a for-profit company." 'via Blog this'

Tuesday, 5 November 2013

CyberTelecom Blog: [NIST] Initiating Review of Cryptographic Standards Development Process

CyberTelecom Blog: [NIST] Initiating Review of Cryptographic Standards Development Process: "To ensure that our guidance has been developed according to the highest standard of inclusiveness, transparency and security, NIST has initiated a formal review of our standards development efforts. We are compiling our goals and objectives, principles of operation, processes for identifying cryptographic algorithms for standardization, methods for reviewing and resolving public comments, and other important procedures necessary for a rigorous process." 'via Blog this'

Google's terms and conditions are less readable than Beowulf

Google's terms and conditions are less readable than Beowulf: "Richard Mortier, a lecturer in computer science at Nottingham, ran Google’s latest revision through the plug-in and found it to have a SMOG score of 15.48. That means users need a GCSE-level reading age to understand it. According to Literatin, 43% of the adult English population would not be able to read the terms.
Texts with a SMOG value in this range require a reading age of between 15-18 if they are to be understood, so anyone hoping to wade through Google’s terms of service and make it out the other side would need to go in equipped with a pretty decent education.
In comparison, the epic Old English poem Beowulf has a SMOG score of 13.9." 'via Blog this'
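For readers curious how such scores are produced: the SMOG grade is a function of how many polysyllabic words (three or more syllables) appear per 30 sentences. Below is a rough Python sketch using a crude vowel-group heuristic for syllable counting; real tools such as Literatin use more careful counts, so exact scores will differ.

```python
import math
import re

def count_syllables(word):
    # Crude heuristic: count runs of consecutive vowels, dropping a
    # trailing silent 'e'. Good enough for a rough SMOG estimate.
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def smog(text):
    # SMOG grade = 1.0430 * sqrt(polysyllables * 30 / sentences) + 3.1291
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
    return 1.0430 * math.sqrt(polysyllables * 30 / len(sentences)) + 3.1291
```

A text with no polysyllabic words bottoms out at the formula's constant of roughly 3.13, while dense legalese full of words like "notwithstanding" and "indemnification" climbs quickly into the mid-teens - the range reported for Google's terms.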