It Takes Community to Save the Planet

The power of community is grossly underestimated in this country. Current collaborative trends on the Internet, from Web 2.0 to Wikipedia-style models, are seen as merely social or, arguably, informative in nature. That’s a bit like saying the U.S. Constitution, another collaborative work, is a fetching flight of fancy thoroughly detached from real-world application. Such a claim can be made, but it doesn’t make a dent in the enduring truth of a multigenerational community commitment.

In effect, community projects on the Web are powerful ideas in action that soar past geographic boundaries and political restraints. In a way, they form the basis of what may become the world’s largest and most peaceful revolution.

Government vs. Science

Case in point: The Bush administration and the right wing (at least its far-right fringe) have made headway in establishing creationism museums and in introducing a religious take on the Earth’s beginnings into some schools’ science classes, despite worldwide scientific findings documenting the evolution of all things earthbound. Other sciences, too, have suffered from a political chokehold in the U.S., from stem-cell research to new energy source development and climate change studies. In general, science projects of all types have endured deep budget cuts and a sharp demotion in societal standing.

Despite this oppressive environment, science lives on and is thriving in unexpected places due largely to the newly formed partnership between the everyday Joe and the most lauded of scientists. In the end, it may very well be the power of community that saves the planet.

Take, for example, BOINC, a project that develops open source software for worldwide volunteer computing. The BOINC model has two aspects:

  1. Scientists who need lots of computing power can use BOINC to create their own project. Projects are independent; there’s no central authority. There are roughly 50 projects in all areas of science.
  2. Computer owners can donate computing power to projects by installing the BOINC client program, then “attaching” it to one or more projects. The client then automatically downloads programs and jobs from those projects and runs them in the background. BOINC uses various techniques to ensure that the PC’s perceived performance isn’t affected, even when the machine is heavily used for scientific computing. (A conceptual sketch of such a client loop follows this list.)

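To make the second half of that model concrete, here is a minimal conceptual sketch in Python of what a volunteer-computing client loop does. It is illustrative only, not BOINC’s actual code; the project URL, job format and function names are invented.

    # Conceptual sketch of a volunteer-computing client loop (illustrative only,
    # not BOINC's real implementation; the project URL and job format are invented).
    import os

    ATTACHED_PROJECTS = ["https://example-science-project.org"]  # hypothetical URL

    def download_jobs(project_url):
        """Stand-in for fetching work units from a project's server."""
        return [{"project": project_url, "input": n} for n in range(3)]

    def run_job(job):
        """Stand-in for the real scientific computation."""
        return job["input"] ** 2

    if __name__ == "__main__":
        try:
            os.nice(19)   # lowest CPU priority on Unix-like systems, so the owner never notices
        except (AttributeError, OSError):
            pass          # e.g. on Windows, where os.nice is unavailable
        for url in ATTACHED_PROJECTS:
            for job in download_jobs(url):
                print(f"uploading result {run_job(job)} to {url}")
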
Projects get computing power only if they can attract and retain volunteers. To do this, they must publicize and explain their research; they must educate the public about their area of science, and convince the public that their particular methods have promise.

“So we’re trying to accomplish two related things,” David Anderson, director of the BOINC project and a research scientist at the Space Sciences Laboratory at the University of California, Berkeley, told LinuxInsider, “enable new science by supplying huge amounts of computing power to scientists who otherwise could not afford it, and get the public interested and involved in science, as a side effect of making an informed decision about how to ‘invest’ their computing power.”

Plenty of Room

The cool thing about collaborative efforts, such as BOINC, is that there is always room for another participant — even the government. The National Science Foundation (NSF) funds BOINC.

“BOINC is an example of NSF’s continuing investment in middleware, which links computer-based resources to individuals and teams, increasing the usability of advanced digital resources, and empowering citizens from around the world to participate in the modern scientific enterprise,” Lisa-Joy Zgorski, spokesperson for the office of the director and office of legislative and public affairs at the National Science Foundation, told LinuxInsider.

BOINC is a powerful, mature software platform with many facets. Anderson shared two in particular that warrant highlighting:

  • It uses cryptography and other techniques to make it safe to volunteer; e.g., even if hackers break into a project’s servers, they won’t be able to use BOINC to distribute malware.
  • It has mechanisms that give projects a high level of assurance of the correctness of computational results, even if some of the volunteered PCs return erroneous results (accidentally or intentionally); a simplified sketch of this idea follows the list.

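One common way such assurance is achieved in volunteer computing is redundancy: the same work unit is sent to several volunteers, and a result is accepted only when enough of them agree. A simplified Python sketch of that idea (not BOINC’s actual validator; the numbers are invented):

    # Redundancy-based validation: accept a result only if a quorum of replicas agree.
    from collections import Counter

    def validate(results, quorum=2):
        """Return the agreed-upon value, or None if the work unit must be reissued."""
        value, agreement = Counter(results).most_common(1)[0]
        return value if agreement >= quorum else None

    # The same work unit sent to three volunteer PCs; one returns a bad value.
    print(validate([42.0, 42.0, 13.7]))  # 42.0 -- two replicas agree, so it is accepted
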
Volunteer computing is wildly successful; there are about 580,000 PCs participating around the world, according to Anderson, and together they supply about 1.1 petaflops of computing power — about three times that of the largest conventional supercomputer.
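
A quick back-of-the-envelope calculation, using only the figures quoted above, shows what that works out to per machine:

    # Rough per-PC average implied by the figures quoted above.
    total_flops = 1.1e15          # ~1.1 petaflops across the volunteer pool
    pcs = 580_000                 # participating PCs, per Anderson
    print(total_flops / pcs)      # ~1.9e9, i.e. roughly 2 GFLOPS per average PC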

“I think this is the tip of the iceberg; if we can figure out how to publicize and market volunteer computing, we should be able to grow participation by 10 times or 100 times,” says Anderson.

Not Just Computing

Anderson is also involved with “distributed thinking,” which uses people rather than PCs to help scientific research over the Internet (e.g., by annotating images, sorting data and doing other tasks that human brains do better than computers).

A related project called “Bossa,” analogous to BOINC, is developing software to help scientists do just that, he explained.

There’s another large initiative, the Encyclopedia of Life (EOL), which shares Anderson’s way of thinking.

The EOL is dedicated to assembling information about the world’s plants, animals and microorganisms. The basic entries in the encyclopedia are species. Since there are about 1.8 million known species on Earth, there will eventually be at least 1.8 million species pages. In addition, as new species are described, they too will be added to the encyclopedia.

“Our goal is simple: To provide a single portal that leads to a Web site for all species,” David Patterson, of the biodiversity informatics group at EOL, told LinuxInsider. “The system will be forever growing and evolving, it will be sufficiently flexible to allow different users to see information in different ways, and it will allow users to distinguish those pieces of information that have been authenticated from those that have not.”

Compiling the Data

Scientists assemble and authenticate information about species so that users can be certain that the information on the authenticated pages is of high quality. In addition, the general public will soon be able to send in their own information (such as photos and observations) about species. This unauthenticated information will also be presented to the public, in a separate place on the pages, and the scientists who are in charge of each page (the EOL calls them the “curators” for the page) will examine this publicly supplied information and incorporate the scientifically credible information onto the authenticated page.

“The project works because it is a creative collaboration between the scientific community and the general public,” James Edwards, executive director of EOL, told LinuxInsider.

Currently, the EOL contains only authenticated information. The capability for the general public to submit new information will be added by the end of 2008.
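
The split between authenticated and unauthenticated material can be pictured with a small, hypothetical data model. This is not EOL’s actual schema or software; the class, methods and example species are illustrative only.

    # Hypothetical model of a species page with separate authenticated and
    # unauthenticated sections (illustrative; not EOL's actual design).
    from dataclasses import dataclass, field

    @dataclass
    class SpeciesPage:
        name: str
        authenticated: list = field(default_factory=list)    # curator-approved material
        unauthenticated: list = field(default_factory=list)  # raw public submissions

        def submit(self, item):
            """Public contributions start out in the unauthenticated section."""
            self.unauthenticated.append(item)

        def curate(self, item):
            """A curator promotes a credible submission to the authenticated page."""
            self.unauthenticated.remove(item)
            self.authenticated.append(item)

    page = SpeciesPage("Danio rerio")
    page.submit("photo of a zebrafish shoal")
    page.curate("photo of a zebrafish shoal")
    print(page.authenticated)  # ['photo of a zebrafish shoal']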

The EOL project is distinguished by a new way of indexing and organizing information about species, an approach Edwards refers to as “taxonomic intelligence.”

Taxonomic intelligence addresses the problem of bringing together information on the same species even when the original sources use different names. The name was chosen, Edwards says, because the project is trying to emulate the way that taxonomists (the scientists who describe and name the world’s species) work.

Second, having identified pieces of information as referring to the same species, the project uses aggregation techniques to bring together material on the same species and the same subject from many Web sites.

“We always seek permission before we do this, and give credit to the authors and providers of information,” says Edwards.

Finally, the curators of a page use wiki techniques to edit the pages.
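
The first two of those steps (reconciling different names to one species, then aggregating records from many sources) can be sketched in a few lines of Python. The synonym table and records below are invented examples, not EOL data or code.

    # Simplified sketch of name reconciliation plus aggregation (invented examples).
    from collections import defaultdict

    # Different sources may use different names for the same species.
    SYNONYMS = {
        "Brachydanio rerio": "Danio rerio",  # older name -> currently accepted name
    }

    records = [
        {"name": "Danio rerio", "source": "FishBase", "fact": "freshwater fish"},
        {"name": "Brachydanio rerio", "source": "a museum database",
         "fact": "native to South Asia"},
    ]

    aggregated = defaultdict(list)
    for record in records:
        canonical = SYNONYMS.get(record["name"], record["name"])  # reconcile the name
        aggregated[canonical].append((record["source"], record["fact"]))

    print(dict(aggregated))
    # {'Danio rerio': [('FishBase', 'freshwater fish'),
    #                  ('a museum database', 'native to South Asia')]}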

For Starters

The first release of the EOL portal is a pilot project. It builds upon collaborations with existing Internet projects to show that a truly comprehensive encyclopedia can be developed.

For example, the underlying classification of life was provided by the Catalogue of Life partnership; the Tree of Life Web provided descriptions of more-inclusive groups of organisms (e.g. mammals, whales, great apes); and FishBase allowed the use of its comprehensive information about the world’s 30,000 species of fishes.

“The existing portal is thus a proof of concept,” explained Edwards. “It contains detailed information on only about 35,000 species, mainly fishes, some amphibians, a few plants, a smattering of other organisms, and about a million ‘minimal species pages,’ in essence, these are placeholders into which information will be put as EOL is able to authenticate it.”

Through its partnerships with scientists and the general public, EOL will fill in the information on these minimal species pages. Edwards estimates that it will take about a decade to assemble basic information on all 1.8 million known species.

Never-Ending Story

“But the EOL will never be finished — as long as we are still learning about those organisms that inhabit this world with us humans, we will always be updating and adding information to the Encyclopedia,” he says.

As with everything, critics question the validity of such projects. “Can the Internet contribute to community efforts to save the planet? Yes, of course it can, but at the end of the day the action must be taken in the real world,” Peter Giblett, a former CIO based in Canada, told LinuxInsider. “The chronicling of all of the world’s species is NOT an action that will save any endangered species on its own. It may record it, but that is all.”

Edwards counters with the argument that awareness is always a first step in any curative action.

“Public participation may range from offers of content — such as images, videos or sounds; sightings of species, artwork, descriptions, critical comments, software, or ideas for the future. We believe that by engaging people they will develop a greater sensitivity for the living world around them, and increase their sense of custodial responsibility for that community. That is, the participatory model democratizes a dimension of the ‘save the world’ efforts,” he says.

Anderson also believes there is real power in public participation. “What do you think is important? Addressing global warming? Curing cancer, Alzheimer’s, malaria, AIDS? Developing alternative energy sources? Understanding the origin and destiny of the universe? Discovering life outside Earth? There are volunteer computing projects in all these areas, and anyone who owns a PC has an opportunity to help, and to make a difference,” he explains.

That’s a lot of power in individual hands. So how should one begin to responsibly exercise this power?

“Choose wisely,” advises Anderson.

Beyond that, jump in and do something.

Viva la revolucion!
