Archive for 2008

P2P Energy Production (Smart Grid) and P2P Web

In Uncategorized on September 9, 2008 at 9:49 pm

Author: Marc Fawzi

Twitter: http://twitter.com/#!/marcfawzi

License: Attribution-NonCommercial-ShareAlike 3.0


In the future, everyone will be an energy producer and consumer. Everyone will produce their own energy and either sell the surplus to others or buy extra wattage from others.

That’s part of the premise and promise of the “smart grid” aka “intelligent utility network” aka “Intergrid.”

See this: http://www.odemagazine.com/doc/56/talkin-bout-my-generation/2

So if everyone can be a producer and consumer of energy, then everyone can also be a producer (not just a consumer) of Web infrastructure: starting with people owning Mesh/802.11s-enabled wireless routers, and extending to people owning and renting out P2P-enabled storage, processing power and connectivity.
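The producer-consumer model above can be made concrete with a small sketch. This is purely illustrative, a hypothetical greedy matcher in Python; the smart grid does not prescribe any particular algorithm, and all names here are made up.

```python
# Minimal sketch of a P2P energy market: each peer reports net power
# (positive = surplus to sell, negative = deficit to buy), and a greedy
# matcher pairs sellers with buyers. Illustrative only -- not how any
# actual smart grid implementation works.

def match_energy(peers):
    """Greedily match surplus (positive) peers with deficit (negative) peers.

    `peers` maps peer name -> net watts. Returns a list of
    (seller, buyer, watts) trades.
    """
    sellers = sorted((w, p) for p, w in peers.items() if w > 0)
    buyers = sorted((-w, p) for p, w in peers.items() if w < 0)
    trades = []
    while sellers and buyers:
        s_watts, seller = sellers.pop()   # largest remaining surplus
        b_watts, buyer = buyers.pop()     # largest remaining deficit
        traded = min(s_watts, b_watts)
        trades.append((seller, buyer, traded))
        if s_watts > traded:              # leftover surplus goes back in
            sellers.append((s_watts - traded, seller))
        if b_watts > traded:              # unmet deficit goes back in
            buyers.append((b_watts - traded, buyer))
    return trades

trades = match_energy({"alice": 300, "bob": -200, "carol": -150, "dave": 100})
```

Every watt of deficit is covered here because total surplus (400) exceeds total demand (350); when it does not, the unmatched buyers would simply remain in the queue.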

Where do today’s dominant Web players fit in such a scenario (e.g. Google)?

Answer: nowhere, as far as I can see.

Google is the biggest private consumer of energy, and it may also become the biggest producer of energy one day. But I'm betting that such a day won't come; i.e., that we will move to a P2P (or edge-driven) consumer-producer model, a P2P Economy, and away from the network- or cloud-centric model.


  1. Towards a World Wide Mesh (WWM)
  2. P2P Energy Economy


People-Hosted “P2P” Version of Wikipedia

In Uncategorized on July 23, 2008 at 1:29 pm

Author: Marc Fawzi

Twitter: http://twitter.com/#!/marcfawzi

License: Attribution-NonCommercial-ShareAlike 3.0

Wikipedia and Web 3.0

Problem Statement:

The New York Times' Web 3.0 article from last year, which is essentially a re-wording of the popular Evolving Trends Web 3.0 article that came out five months before it, was accepted into the Wikipedia entry on Web 3.0, while a Wikipedia admin (or zealot) rejected the inclusion of the Evolving Trends article on the grounds that it is a blog entry, i.e. insignificant. This despite the fact that the Evolving Trends article has been read by 211,000 people (to date) and quoted by hundreds of people, which probably makes it the most-read blog article about the Semantic Web to date.

So it is disturbing that arbitrary rules, which are often applied arbitrarily, can be exploited to put the "privilege to dictate what qualifies as knowledge" ahead of the public's right to complete, well-rounded and uncensored knowledge.

Why was a copy-cat article about Web 3.0 authored by the New York Times more significant than the original blog article, which preceded it and was read and quoted by a very significant number of people?

In other words, what makes the blog a lesser medium than a newspaper, especially one that has had several ethics breaches including plagiarism?

Let's say I had agreed to publish the Web 3.0 article in question in a well-respected academic journal that had contacted me (through a contributing editor at Stanford University). Would that have made the ideas any more legitimate?

The real proof of quality comes from the relevance of the subject to the people. Today, two years after I blogged "Wikipedia 3.0: The End of Google?" (the first article to coin the term Web 3.0 in connection with the Semantic Web, AI agents and Wikipedia), we can find many Semantic Web startups applying semantic technology to Wikipedia. Before the publication of the article, there was not one such startup and no mention of Wikipedia in the context of the Semantic Web, although the Semantic MediaWiki team was already working in that direction.

Is it by pure coincidence that after the huge popularity of the Evolving Trends Web 3.0 article we now have not one or two but several startups and groups working on applying semantic technology to Wikipedia, including PowerSet, a startup that was recently acquired by Microsoft, and, not surprisingly, Wikia, a commercial venture led by Jimmy Wales, Wikipedia's founder? In addition, when the Evolving Trends Web 3.0 article was published, it became the top blog post on the Google Finance front page (and stayed there for a few months) when people searched for GOOG (Google's stock symbol). I'm sure Google's investors and management did notice it, so it's not hard at all to think that it also had _some_ influence on Google's decision to build a Wikipedia competitor, although there is no way to prove it did.

All of this leaves me wondering why the Evolving Trends Web 3.0 article was removed from the Web 3.0 entry in Wikipedia. After all, it was the first article to coin, in a highly publicized manner, the term "Web 3.0" in conjunction with the Semantic Web and Wikipedia.

When the rules are arbitrary, and when they are applied arbitrarily, it’s impossible to tell the reason or the motive behind the reason.

The whole affair is not an isolated case. The same thing has happened, and is happening regularly, to many well-known bloggers and authors, though each case has its own unique circumstances and flavor (or personal experience).

Thus, based on my experience and the experiences of many others, I feel there is a real flaw in the governance model of Wikipedia, and it needs to be fixed. Otherwise we risk being exposed more broadly to the tyranny that comes with the arbitrary dictation of the truth and the rewriting of history to fit the agenda of those with power and influence, who can rewrite history at will by dictating what gets written about any event. In my particular case, that event happens to be the highly publicized first coining of the term "Web 3.0" in conjunction with the Semantic Web and Wikipedia itself, which is nowhere to be seen on Wikipedia!

The Solution is P2P:

The best fix, IMO, is to replace Wikipedia with a distributed, P2P-hosted encyclopedia that allows multiple versions of any given topic, from different authors, which would be rated by the users.

Eventually, or as the second logical step, we would need to apply a democratic model rather than rely on the unwisdom of the crowds. In other words, are the Wikipedia admins elected by the people? No, they're not! So what must be done in the proposed "people-hosted" Wikipedia is to let the people (us, the users) elect representatives who would rate up or down the various versions of a given topic entry submitted by different authors (but who would not be able to delete or bury any of the versions).

See the following for more on building a governance model:

  1. The Unwisdom of Crowds
  2. The Future of Governance
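The governance model proposed above has two load-bearing properties: topic versions are append-only, and elected representatives can rate but never delete. A minimal sketch in Python, with all class and method names being my own illustrative assumptions rather than a specification:

```python
# Hypothetical sketch of the proposed governance model: every topic keeps
# multiple versions from different authors; representatives can only rate
# versions up or down -- there is deliberately no delete() or bury().

class Topic:
    def __init__(self, name):
        self.name = name
        self.versions = []   # (author, text) pairs -- append-only by design
        self.ratings = {}    # version index -> net representative score

    def submit(self, author, text):
        """Any author may add a version; it can never be removed."""
        self.versions.append((author, text))
        index = len(self.versions) - 1
        self.ratings[index] = 0
        return index

    def rate(self, version, delta):
        """Elected representatives rate up (+1) or down (-1)."""
        self.ratings[version] += delta

    def ranked(self):
        """All versions, best-rated first; none are ever hidden."""
        order = sorted(self.ratings, key=self.ratings.get, reverse=True)
        return [self.versions[i] for i in order]

topic = Topic("Web 3.0")
a = topic.submit("evolvingtrends", "Web 3.0 = Semantic Web + AI agents + Wikipedia")
b = topic.submit("nytimes", "Web 3.0 is the next phase of the Web")
topic.rate(a, +1)
topic.rate(b, -1)
```

The point of the design is that rating only reorders what readers see first; a down-rated version drops in the ranking but stays permanently retrievable, which is exactly what today's delete-happy admin model does not guarantee.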


Wikia and Web 3.0

The hosting of Semantic MediaWiki, i.e. the Web 3.0 version of Wikipedia's platform, has been taken over by Wikia, a commercial venture founded by Wikipedia's own founder, Jimmy Wales. This opens up a huge conflict of interest: Wikipedia's founder is running a commercial venture that takes creative improvements to Wikipedia's platform, e.g. Semantic MediaWiki, and hosts (with the potential to transfer) those improvements on Wikia, his own for-profit venture. This shows poor judgment at best and an explicit conflict of interest at worst. And we're talking about a key figure in Wikipedia's governing body.


New York Times and Web 3.0

Here is the Evolving Trends article that was the first article to coin, in a very publicized manner, the term “Web 3.0” in the context of the Semantic Web, Wikipedia and AI agents:


And here is the Web 3.0 article by the New York Times, which came five months after the above-mentioned article:



  1. Wikipedia 3.0: The End of Google?
  2. The Unwisdom of Crowds
  3. The Future of Governance

P2P version of Twitter using Flash 10.1

In Uncategorized on May 22, 2008 at 3:54 am

A massively scalable, highly redundant version of Twitter can be built using the P2P feature of Flash 10.

For pennies, too.

The devil is always in the details but this is something that can be conquered now, thanks to Adobe.

To put it another way: building a massively scalable, highly redundant Twitter clone is not exactly trivial, but it can NOW be done 800 times more easily than writing your own P2P layer, not to mention asking people to download a PC client.

Adobe has just changed the game by making P2P technology accessible to all Flex/Flash developers.
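To give a feel for the idea: in a P2P Twitter, a post is flooded through the overlay so that many peers hold redundant copies and no central server is needed. The sketch below is a local Python simulation of that flooding pattern, not Flash/RTMFP code; the `Peer` class and the chain overlay are my own assumptions.

```python
# Local simulation of a serverless "Twitter": each peer keeps a redundant
# copy of every post it sees and forwards new posts to its overlay
# neighbors. Duplicate suppression stops the flood from looping forever.

class Peer:
    def __init__(self, name):
        self.name = name
        self.neighbors = []
        self.timeline = []   # redundant copy of every post this peer has seen

    def connect(self, other):
        self.neighbors.append(other)
        other.neighbors.append(self)

    def post(self, msg):
        self.receive((self.name, msg))

    def receive(self, tweet):
        if tweet in self.timeline:
            return               # already seen: stop the flood here
        self.timeline.append(tweet)
        for n in self.neighbors:
            n.receive(tweet)     # forward to overlay neighbors

peers = [Peer(f"p{i}") for i in range(5)]
for a, b in [(0, 1), (1, 2), (2, 3), (3, 4)]:   # a simple chain overlay
    peers[a].connect(peers[b])

peers[0].post("hello, serverless world")
```

After the post, every peer in the chain holds a copy of the tweet, which is where the "highly redundant" property comes from: losing any single peer loses no data.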

Towards a World Wide Mesh (WWM)

In Uncategorized on March 8, 2008 at 6:34 am

Author: Marc Fawzi

Twitter: http://twitter.com/#!/marcfawzi

License: Attribution-NonCommercial-ShareAlike 3.0

The One Laptop Per Child (OLPC) project has brought some interest to mesh networking.

In theory, the XO laptop has the ability to form a wireless mesh together with other XO laptops in its vicinity. Each laptop extends the mesh further, like a link in a long chain.

Such mesh technology, when supplemented by signal repeaters, can theoretically cover entire villages. And since villages, towns and cities are connected to each other via the Internet, these meshes can come together into one world-wide mesh.
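The "link in a long chain" behavior is just multi-hop forwarding. A hedged sketch (the XO's actual 802.11s implementation works differently): model laptops as nodes in a graph and count how many radio hops a packet needs to cross the mesh.

```python
# Multi-hop mesh reachability via breadth-first search. Each edge is a
# direct radio link between two laptops; a packet can still reach a
# distant laptop by being relayed hop by hop.

from collections import deque

def hops(links, src, dst):
    """Return the minimum hop count from src to dst, or None if unreachable."""
    graph = {}
    for a, b in links:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == dst:
            return dist
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None  # no path: the mesh is partitioned

# Four laptops in a chain: A reaches D through B and C, three hops away.
chain = [("A", "B"), ("B", "C"), ("C", "D")]
```

Adding repeaters or more laptops simply adds edges, which shortens the hop count and adds redundant paths; that is the sense in which each new node extends the mesh.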

Today's Web, by contrast, is constrained (by many economic, political and technical factors) to working within the client-server model. The inability to establish direct communication between applications on different PCs without going through difficult, and sometimes unreliable, paths (think: UDP hole punching, NAT traversal, UPnP, etc.), combined with ISPs' tendency to throttle and even block P2P traffic, has resulted in an unhealthy environment for P2P applications.
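UDP hole punching, one of the workarounds mentioned above, relies on both peers sending to each other at roughly the same time so that each side's NAT opens a mapping for the other's packets. A real punch needs a rendezvous server and two NATs; the Python sketch below only demonstrates the simultaneous-send pattern on localhost, where no NAT is involved.

```python
# Localhost demonstration of the send-first-then-receive pattern behind
# UDP hole punching. On the real Internet, each peer would learn the
# other's public address:port from a rendezvous server before punching.

import socket

a = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
b = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
a.bind(("127.0.0.1", 0))   # port 0 = let the OS pick, as a NAT would
b.bind(("127.0.0.1", 0))
a.settimeout(2.0)
b.settimeout(2.0)

# Stand-in for the rendezvous step: read the peer addresses directly.
addr_a, addr_b = a.getsockname(), b.getsockname()

# Both sides "punch" by sending first, then receiving. Behind real NATs,
# the outbound packet is what creates the inbound mapping.
a.sendto(b"punch from a", addr_b)
b.sendto(b"punch from b", addr_a)

msg_at_b, _ = b.recvfrom(1024)
msg_at_a, _ = a.recvfrom(1024)
a.close()
b.close()
```

The fragility the paragraph complains about comes from everything this sketch omits: symmetric NATs that assign a fresh port per destination defeat the technique entirely, which is why P2P on today's Internet remains unreliable.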

The XO laptop may be the first sign of a global shift from the client-server model of the Web to the peer-to-peer model of wireless mesh technology.

The current Web architecture is bound to evolve over the next few decades as the architecture of global communication moves from the network-centric model to the peer-to-peer model, enabled by wireless mesh technology.


  1. World Wide Mesh