Saturday, December 5, 2009

Elevator Door




Don't forget to sign up for Random Thoughts updates via email in the right margin.

Wednesday, December 2, 2009

American Flag




From the multimedia mind of a four-year-old. This is a rendering of the Stars and Stripes.

Tuesday, February 17, 2009

This Blog Has Moved

Readers,

In an attempt to consolidate my online profile, this blog will no longer be updated. The content will remain, but all further postings will be made at Random Thoughts.

Please join me at Random Thoughts for random thoughts on technology, world affairs, pop culture, and maybe fantasy football. Thanks.

Tuesday, May 6, 2008

The Aggregator

What Is An Aggregator?
In my 4 April post I touched briefly on the aggregator. Essentially, an aggregator provides a consolidated view of content in a single browser display or desktop application, and in its simplest form draws that content from RSS (see my 4 April post -- Harnessing Collective Intelligence -- for more on RSS).

What does that mean in English? Well, if your daily web routine involves visiting a number of different web sites for news, sports, entertainment, politics, blogs, etc. to get your information fix, an aggregator can simplify that process for you. The aggregator allows you to "subscribe" to the content you want to receive and then consolidates that content in a user-friendly format. The below video is an okay tutorial on RSS and aggregators...



Different Types of Aggregators: Web Based or Client Software?

There are many different aggregators to choose from. For a running list of available aggregators visit Wikipedia.

In my brief experience with aggregators I have found that most have similar features. They allow you to organize your subscribed content -- much like you would your email inbox -- show a headline and the first few sentences of each item, link you to the original content if you want to read the entire item, and let you "tag" the content for future reference.
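To make the mechanics concrete, below is a minimal sketch of what an aggregator does under the hood: fetch each subscribed RSS feed, pull out a headline, a short snippet, and a link, and consolidate everything into one list. It's written as browser TypeScript, the feed URLs are placeholders rather than real subscriptions, and a full aggregator would fetch feeds server-side (to sidestep cross-origin limits) and add the organizing and tagging features described above.

```typescript
// Minimal RSS aggregator sketch (browser TypeScript). Placeholder feed URLs.
interface FeedItem {
  feed: string;    // which subscription the item came from
  title: string;   // the headline
  summary: string; // the first bit of the description
  link: string;    // link back to the original content
}

const subscriptions = [
  "https://example.com/news/rss.xml",   // hypothetical feeds
  "https://example.com/sports/rss.xml",
];

async function fetchFeed(url: string): Promise<FeedItem[]> {
  const xmlText = await (await fetch(url)).text();
  const doc = new DOMParser().parseFromString(xmlText, "text/xml");
  const feedTitle = doc.querySelector("channel > title")?.textContent ?? url;

  return Array.from(doc.querySelectorAll("item")).map((item) => ({
    feed: feedTitle,
    title: item.querySelector("title")?.textContent ?? "(untitled)",
    // Keep only the first couple hundred characters, like a typical aggregator view.
    summary: (item.querySelector("description")?.textContent ?? "").slice(0, 200),
    link: item.querySelector("link")?.textContent ?? "",
  }));
}

async function aggregate(): Promise<void> {
  // Pull every subscription and consolidate the items into a single view.
  const items = (await Promise.all(subscriptions.map(fetchFeed))).flat();
  for (const item of items) {
    console.log(`[${item.feed}] ${item.title}\n  ${item.summary}\n  ${item.link}`);
  }
}

aggregate();
```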

The biggest decision you will make with an aggregator is web or client. A web-based aggregator is just that; it's accessible through a browser. If you have Internet access and a browser, and use a web-based aggregator, you can access your aggregator from any PC. I am a web-based aggregator user. I use Google Reader, but there are many others (see the Wikipedia link above). Below is a shot of Google Reader.


In contrast, a client-based aggregator is software installed on your PC. To view your feeds you must be at that PC. A good client-based aggregator I have used is FeedReader. Below is a shot of FeedReader.


Both types of aggregators have their pluses and minuses. However, for me, being able to access my aggregator from anywhere is a big plus. Some will tell you that client-based aggregators are faster than web-based ones -- and thus more efficient -- but I have just not found this to be true. I started with Google Reader, tried FeedReader, and went back to Google Reader because I actually found FeedReader to be slower. Maybe it's my clunky HP laptop...

Another feature to look for is offline access. Most client-based aggregators have it, and I know that Google Reader offers offline access as well; this may not be the case with all web-based aggregators, though.

At the end of the day, even for the casual web surfer, the aggregator is a must for efficiently accessing, receiving and organizing content from myriad web sites. For me it's a keystone for effectively managing my web experience.

Thursday, April 17, 2008

Finishing Up O'Reilly's Seven Web 2.0 Principles

In earlier posts I discussed two of the O'Reilly Web 2.0 principles ("The Web as Platform" and "Harnessing Collective Intelligence"). In this entry I'll discuss a couple more: "End of the Software Release Cycle" and "Rich User Experiences." I would also like to note two other O'Reilly principles, "Data is the Next Intel Inside" and "Software Above the Level of a Single Device," which I will not delve into. However, if you would like more information on these two principles please see the "What is Web 2.0" article I have been referencing for these posts.

End of the Software Release Cycle
What exactly does "End of the Software Release Cycle" mean? Essentially, the Web 2.0 era is a time of the constant beta -- software accessed through the internet with a web browser is always in the developmental stage. Take the Google applications as an example. I use Gmail, Calendar, Documents, and Reader and they are always evolving. However, the evolution is not intrusive -- meaning that it does not affect my ability to continue to use the tools. Most of the changes are transparent, but occasionally there will be a significant update that is easily apparent. Of the Google tools I use, Documents seems to change the most. See the below image from a Phil Lenssen 28 Feb blog post.

As you can see, the change here was a simple update to the toolbar in Google Documents. But this change highlights the continuous beta cycle of the Web 2.0 era. As O'Reilly states, "...one of the defining characteristics of internet era software is that it is delivered as a service, not as a product."

The Google Documents example is pretty powerful because what you have with Documents is a mini, internet-based version of the Microsoft Office suite (word processor, spreadsheet, and presentation software) -- albeit a less robust one -- that only requires a web browser, is constantly evolving (in beta form), is accessible anywhere there is internet access, and carries no long software development and release cycle. The service is always being improved, and it's free. This type of development cycle is significantly different from what we are used to from the PC or client-server era, and it will require a significant paradigm shift for software companies still operating on a design-and-release-a-product model -- as opposed to a software-as-a-service model.

Rich User Experiences
This is another way of saying that web developers are now able to build web applications as rich as local PC-based applications.

Continuing with the Google theme, according to O'Reilly, Gmail was the first "mainstream" web application to deliver rich user interfaces and PC-equivalent interactivity. The combination of several technologies, which became known as AJAX, made this leap forward possible. Since Gmail was introduced, a flood of other software service applications with rich interfaces has followed. According to O'Reilly:

"We expect to see many new web applications over the next few years, both truly novel applications, and rich web reimplementations of PC applications. Every platform change to date has also created opportunities for a leadership change in the dominant applications of the previous platform."

Using the Google Documents example again, essentially O'Reilly is saying that a platform change -- from a PC-based software PRODUCT (Microsoft Office) to an internet-based comparative software SERVICE (Google Documents) -- based on a rich user experience has the potential to cause a paradigm shift in the way we view and use computers.
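To show what that AJAX-style interactivity looks like in code, here is a small sketch of the pattern behind a Gmail-like interface: ask the server for new data in the background, then redraw only the affected part of the page instead of reloading it. It uses the modern fetch API in place of the XMLHttpRequest calls of the original AJAX stack, and the /api/inbox endpoint and inbox-list element are hypothetical.

```typescript
// AJAX-style partial page update (browser TypeScript). The endpoint and
// element id are hypothetical; the point is the pattern, not a real mail API.
async function refreshInbox(): Promise<void> {
  // 1. Fetch new data in the background -- no full page reload.
  const response = await fetch("/api/inbox?unread=true");
  const messages: { from: string; subject: string }[] = await response.json();

  // 2. Redraw only the part of the page that changed.
  const list = document.getElementById("inbox-list");
  if (list) {
    list.innerHTML = messages
      .map((m) => `<li><strong>${m.from}</strong>: ${m.subject}</li>`)
      .join("");
  }
}

// Poll every 30 seconds, roughly how early AJAX webmail clients stayed current.
setInterval(refreshInbox, 30_000);
```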

At the end of the day, O'Reilly's Web 2.0 principles have offered a great starting point for exploring the Web 2.0 phenomenon. With these principles as a foundation, in future posts I will delve into some of the Web 2.0 applications and technology that I am discovering.

Thursday, April 10, 2008

Defining Web 2.0 (Videos): Three Different Definitions

I was doing a little research on YouTube and found a few videos I thought I would share.

1. This video is a short clip (50 seconds) of Tim O'Reilly "defining" Web 2.0. O'Reilly, as with everything else I have read by him, approaches a Web 2.0 definition from a business process perspective.



2. This clip (5:17) takes the information from the Wikipedia entry for Web 2.0 and presents it in video format. Nothing earth-shattering, but for those of you who prefer the movie over the book, this is for you.



3. The final clip (3:00), which I found to be the most enlightening, is from Andi Gutmans, co-founder of Zend. Gutmans approaches a definition of Web 2.0 from more of a technical standpoint. His main Web 2.0 attributes are:
1. Rich Internet Applications (RIA)
-Flash
-Ajax
2. Service Oriented Architecture (SOA)
-Feeds
-RSS
-Web Services
-Mash-Ups (see the sketch after this list)
3. Social Web
-Tagging
-Wiki
-Blogging
-Podcasting
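
Since Gutmans lists mash-ups under service-oriented architecture, here is a small sketch of the idea: take two independent web services and join their data into a new, combined view. Both endpoints (and the city-based join) are hypothetical, used only to illustrate the pattern.

```typescript
// A tiny mash-up sketch (TypeScript): combine two hypothetical JSON web
// services -- local news and weather -- into one view, joined on city name.
interface Story { title: string; city: string; }
interface Forecast { city: string; tempF: number; }

async function newsWithWeather(): Promise<void> {
  const [stories, forecasts] = await Promise.all([
    fetch("https://example.com/api/local-news").then((r) => r.json() as Promise<Story[]>),
    fetch("https://example.com/api/weather").then((r) => r.json() as Promise<Forecast[]>),
  ]);

  // Merge the two services into something neither offers on its own.
  for (const story of stories) {
    const forecast = forecasts.find((f) => f.city === story.city);
    console.log(`${story.title} (${story.city}${forecast ? `, ${forecast.tempF}°F` : ""})`);
  }
}

newsWithWeather();
```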


Monday, April 7, 2008

'The Grid': A New Internet, But on Steroids

I was surfing my Google Reader today and stumbled across a Sunday Times article titled, "Coming Soon: A Super Fast Internet" -- this piqued my interest. The intro to the article read, "The Internet could soon be made obsolete. The scientists who pioneered it have now built a lightning-fast replacement capable of downloading entire feature films within seconds. At speeds about 10,000 times faster than a typical broadband connection, “the grid” will be able to send the entire Rolling Stones back catalog from Britain to Japan in less than two seconds" [1]. Kind of like an Internet on steroids -- I wonder if Jose Canseco knows about this?...

The article went on to state that Cern, the particle physics center near Geneva where Tim Berners-Lee invented the web, started the grid computing project seven years ago (sorry, Jose) to support the Large Hadron Collider (LHC) -- the LHC is designed to probe the origin of the universe. Apparently, scientists working on the LHC project estimated that, once it goes online this summer, the annual data output could be 56 million CDs' worth of information. I'm not sure how much data that is, but it sounds like a lot. They determined that that data output might bring the Internet to its knees; hence the creation of the grid.

While the Internet was created with a mix of cables and routing equipment originally designed for voice transmission, the grid has been built with dedicated fiber-optic cables and modern routing centers. According to the article, 55,000 servers -- with a goal of reaching 200,000 within the next two years -- have already been installed. The grid connects Cern to eleven other centers in the U.S., Canada, the Far East, and Europe.

A simple Google search turns up a ton of information on this subject. Obviously, this is new to me, but it's not really new. For a few quick references I found, check out the below links:

-Father of the Grid
-Grid Computing (Wikipedia)
-Open Grid Forum

The notion of the grid interested me because of its potential to alter how we interact with the Internet. While the grid, as designed by the folks at Cern, has a specific mission to help process LHC data, and is not expected to be available to the public, its pioneering capabilities and technologies could influence a new Internet. After all, isn't that how we got to where we are today with the Internet? David Britton, a leading figure on the grid project, summed up the potential by stating, "With this kind of computing power, future generations will have the ability to collaborate and communicate in ways older people like me cannot even imagine" [1]. Can you say Web 3.0?...

For some reason, when I first read the Sunday Times article my first thought was The Matrix. Not sure why -- maybe it was the whole grid/matrix similarity... I don't know. In any event, it has nothing to do with the grid, but I thought I would throw The Matrix trailer in for your enjoyment...



References:
1. Jonathan Leake, "Coming Soon: Superfast Internet," The Sunday Times, 2008-04-06.