Speed through distributed parallelism


Although it may seem obvious that small teams collaborate faster than large ones, our information technologies have generally not supported this.

I first noticed a TED video (posted Sept. 2012) where Clay Shirky, while focused on open government, started talking about the way in which Git works.  Git is essentially a Finnish technology, created by Linus Torvalds (the inventor of Linux) in 2005 as a way of managing coordination on the Linux kernel.  The video can be found at http://www.ted.com/talks/clay_shirky_how_the_internet_will_one_day_transform_government.html .

More recently, I found a Realtime Conference video (from Oct. 2012), where Ward Cunningham (inventor of the first wiki in 1995) describes a new project on Federated Wiki, which aims to remove some of the roadblocks that deter innovation in Wikipedia.  That video can be found at http://vimeo.com/52637141 , although you may prefer to quickly read my text digest of highlights.  Wired had covered the project in a July 2012 article, "Wiki Inventor Sticks a Fork in his Baby".

The idea of forking content -- starting from a common base and working in parallel -- suggests that contributions will be negotiated, so that advancements can be merged into an improved revision.  On the other hand, if a fork is not valuable to the original project but is valuable in another context, a subgroup has the opportunity to take that revision in its own direction, letting creativity continue along a different path.
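The fork-and-merge cycle described above can be sketched with plain git commands.  This is a minimal illustration, not a prescription: the directory, file name, and branch name are all invented for the example.

```shell
# Illustrative fork-and-merge cycle; paths and names are hypothetical.
rm -rf /tmp/fork-demo && mkdir /tmp/fork-demo && cd /tmp/fork-demo
git init -q
git config user.email "demo@example.com" && git config user.name "Demo"

# The common base everyone starts from.
echo "shared starting point" > notes.txt
git add notes.txt && git commit -qm "common base"
base=$(git rev-parse --abbrev-ref HEAD)   # record the base branch name

# Fork: a contributor branches off and revises in parallel.
git checkout -qb revision
echo "a parallel improvement" >> notes.txt
git commit -qam "work on the fork"

# Merge: the advancement is negotiated back into the base.
git checkout -q "$base"
git merge -q revision
cat notes.txt   # shows the base line followed by the improvement
```

Had the fork instead diverged in a direction the original project did not want, the `revision` branch could simply live on unmerged, which is the second path the paragraph above describes.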

The idea of forking and merging educational content surfaces in a "Fork, Merge & Share" blog post by Greg Wilson, which suggests that merging content after the fact could be rather difficult.  If the architecture of collaboration is designed, as Ward Cunningham suggests, for publish-then-review rather than review-then-publish, the cycle of feedback could be a lot faster.

[This blog post was originally published on Rendez Central]