Tuesday, September 18, 2007

Home Stretch – So what have we learned?

Summary and Lessons Learned in CS 855 – Futuring and Innovation

These past eleven weeks have been fast and furious. From our July residency, with Dr. William Halal lecturing on futuring and the principles behind his think tank “TechCast”, to our exploration of Web 2.0 and the tools that are now widely available, it has been a wild ride.

Web 2.0
One of the slickest Web 2.0 tools, Animoto, was posted by Ted in our class. It allows you to upload pictures and set them to music to create a video trailer. I took some pictures from our last residency and created a short video.


Michelle was the first in our class to demonstrate Gabcast, which makes it easy to record audio blogs using an ordinary telephone. I have to admit it is sometimes easier to listen to a blog, and Gabcast has a nice feature that allows you to download the file so you can listen to it in the car.

We explored the many faces of Google Web 2.0, from Google Maps, Google Mail, Google Groups, Google Docs, and iGoogle, and the list goes on. As you can see, Google really dominates Web 2.0. With its recent purchase of YouTube, it is well positioned to lead the industry into Web 3.0 (the next Microsoft). The real concern with most of Google’s Web 2.0 tools is the storing of user data and how well users’ privacy will be protected. Pictures that I upload to share with my family are not something I necessarily want cached on Google’s search engine. This should be a concern for most of the Web 2.0 services we reviewed in this class, not just Google.

Second Life
The idea of a virtual multi-dimensional environment has endless possibilities. Starting out in 2003, Second Life has grown to almost 10M residents. It now supports real-time classrooms along with a complex economic structure. The ability to hold virtual classrooms and interact not only with a professor but also with other students, while separated by any distance, is the jewel of Web 2.0. The concept of Second Life has expanded the idea of collaboration with endless possibilities: using it in a work environment to conduct virtual brainstorming sessions, where it is possible to have breakout groups but still interact as a whole, or bringing together virtual teams that are physically located on different continents.

After reading “The Fortune Sellers” it became very clear that predicting the future is based more on luck than on science. The world around us is in constant chaos, making most social systems unpredictable. The stock market is a good example of such a system: very few market analysts are able to predict the direction of the market with any degree of consistency [1]. Areas founded on hard science can achieve a higher degree of accuracy if engineering principles are applied.

Failed Predictions
Often failed predictions can be attributed to changing conditions, a lack of supporting data, too great a lag time between the prediction and the event, or the inexperience of the person making the prediction. Some social systems are so complex that any form of prediction is difficult and carries a high level of uncertainty.

My failed prediction was Ford Motor Company’s flying car in the 50s. The prediction failed due to government regulations imposed by the FAA: the air traffic infrastructure would not have been able to handle the traffic if flying cars had gone into mass production. Matt commented that the prediction may not have failed but merely been delayed, and that flying cars may still become a reality. I believe that the problems that prevented flying cars from becoming a reality in the 50s are still present today. This is likely to remain an inhibiting factor until air traffic management can be fully automated and the human element removed.

Successful Predictions
One of the most widely known predictions is Moore’s law, which states that circuit density on integrated circuits will double every 18 months. This prediction was made back in 1965 and has remained true for the past 40+ years. If we look at the circumstances around this prediction, we discover that it was made quickly, without much thought. Gordon Moore had been asked by Electronics magazine to predict semiconductor progress over the next 10 years. The state of the industry at the time was 30 circuits per chip, and he knew the prototypes he had in the lab were at 60 circuits. Taking an educated guess, he expected the density in 10 years to reach 60,000 circuits, a thousand-fold increase, which turned out to be very precise.[2] It is amazing that one of the most famous predictions of the 20th century could have been worked out on the back of a cocktail napkin.
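Moore's extrapolation is simple compounding, and the arithmetic really does fit on a napkin. A minimal sketch, reconstructed from the figures above (my own illustration, not Moore's actual calculation):

```python
# Back-of-the-napkin reconstruction: 60 circuits in the lab, density
# doubling roughly once a year over a ten-year horizon.
lab_density = 60
doublings = 10  # one doubling per year for ten years

projected = lab_density * 2 ** doublings
print(projected)  # 61440, in line with the thousand-fold jump he guessed
```

Whether you call the step a yearly doubling or an 18-month doubling changes the exponent, but either way the guess was an exponential extrapolation from two data points.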

This class on Socio-Technical Futuring has been one of the most inspiring courses in the doctoral program. The art of prediction was truly fascinating. It revealed what most of us already expected: in a lot of cases the expert is no better at making predictions than we are. Some social environments are too complex to predict with the tools [1] that are available. What we should recognize is that we can only predict with any degree of certainty what we have under our control. The more uncertainty, the greater the margin of error we are likely to encounter.

[1] William A. Sherden, “The Fortune Sellers: The Big Business of Buying and Selling Predictions,” John Wiley & Sons (1998)
[2] Patrick Gelsinger, keynote, InfoWorld CTO Forum, San Francisco (2002), http://www.intel.com/pressroom/archive/speeches/gelsinger20020409.htm

Mark's Top 5 Predictions

1. Prediction – In the near future all electronic communications will be computer to computer. The only difference will be whether it is a person or a computer that wants to communicate.

2. Prediction – In the next 10 years the military will adopt computers that are integrated into clothing, and within the next 15 years we will see this technology become popular in the civilian sector.

3. Prediction – The Internet will extend itself through personal peer-to-peer networks to allow connectivity to any person, anywhere, at any time. All the communication networks that we know now (telephone, TV, and cell phones) will become IP-based.

4. Prediction – Computing will become ubiquitous through the use of personal computers we wear. We will use this interface to connect to network-based computers for additional processing power or information retrieval. This will become possible through the use of encryption and advanced authentication methods.

5. Prediction – Within the next 75 years we will see advances in how we generate and transport electricity, based partially on research done by Nikola Tesla and now being investigated by MIT.

Sunday, September 16, 2007

Analyzing Successful Predictions

We are constantly reading predictions, from weather forecasts, to the stock market, to the cost of heating oil this winter. In every major field there is someone making a prediction and trying to convince us based on past historical trends, the alignment of the planets, or the fact that caterpillars have hairy coats this fall. Statistical analysis of these forecasts has shown that the best have about a 50 percent chance of being correct, while others rank with winning the lottery.

Most mutual fund managers don’t perform any better than, or even as well as, the S&P index [1]. In fact most correct predictions are correct only by luck, or the original prediction was stated incorrectly in the first place.[2]

The World Meteorological Organization reports the global weather forecasting budget to be approximately $4B per year. Yet long-range weather forecasts are at best 50% correct. What has been shown is that as the lag time between the prediction and the event decreases, the accuracy of the prediction increases. [2]

So what makes one prediction better than the next? In what sectors do predictions tend to be reliable forecasts of the future? This age-old question has been pondered over the decades. Any prediction should be viewed with caution and judged on the merits it is founded on. If a prediction is based on readings of sacrificed goat entrails, it should be viewed with much caution. If it is based on hard scientific research, then maybe more thought should be given to it. In “The Fortune Sellers” the author lists 5 questions that should be asked: [2]

1. Is the forecast based on hard science?
2. How sound are the methods?
3. Does the forecaster have credible credentials?
4. Does the forecaster have a proven track record?
5. To what extent is my belief in a particular forecast influenced by my personal beliefs and wishful thinking?
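Tallied mechanically, the five questions above amount to a simple scorecard. A small sketch (the function and key names are my own illustrative invention, not Sherden's):

```python
# Hypothetical scorecard for the five questions.
# True = pass, False = fail, None = unknown/insufficient information.
def score_forecast(answers):
    values = list(answers.values())
    return (values.count(True), values.count(False), values.count(None))

# Example: a forecast strong on credentials but weak on science and methods.
example = {
    "hard_science": False,
    "sound_methods": False,
    "credible_credentials": True,
    "track_record": None,
    "free_of_wishful_thinking": True,
}
print(score_forecast(example))  # (2, 2, 1) -- passes, fails, unknowns
```

As the two case studies later in this post show, a forecast can score badly on such a tally and still turn out to be right, which is exactly the problem with checklists of this kind.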

Even after reviewing these questions it is still important to ask yourself if current events have been factored into the prediction. Forecasters may be predicting a mild winter, so the expected price of heating oil may be forecast to be low; however, if an oil refinery closes due to a fire, the end result may be high heating oil prices. The longer the lag time on a prediction, the higher the probability that an external event may impact it.

These questions should apply to any prediction, whether it appears in a local newspaper or is part of a strategic report from Gartner that your company paid for. Be leery of predictions based on “group think”. Just because a group of analysts supports a given prediction doesn’t mean that it is correct; they may all have read the same market research report. Look for supporting data from different research sources.

In an attempt to test this approach I have selected a couple of predictions that have been proven with time, to see if this model is supported. The two predictions are Moore’s law and Elaine Garzarelli’s prediction of the 1987 stock market crash.

If we look at the circumstances around Gordon Moore’s famous prediction of circuit density doubling every 18 months, we find an interesting story. He had been approached by Electronics magazine to predict semiconductor progress over the next 10 years. The state of the industry at the time was 30 circuits per chip, and he knew the prototypes he had in the lab were at 60 circuits. Taking an educated guess, he expected the density in 10 years to reach 60,000 circuits, a thousand-fold increase, which turned out to be very precise.[3] There appears to be a good amount of luck in this forecast, but it was extrapolated from the best data available at the time. If we apply the 5 rules we previously discussed, steps 1 & 2 fail, step 3 is met, step 4 would probably fail (insufficient information), and step 5 would probably not be a weighting factor. So, to summarize: 1 pass, 3 fails, and 1 unknown. The prediction, however, was true and still holds true after 42 years. In the case of Gordon Moore’s prediction, he was an expert in his field, he had credible data to extrapolate from, and he had a good share of luck.

For the next case I draw from a case study cited in “The Fortune Sellers”. Elaine Garzarelli had developed a model to predict the direction of the market as either bullish or bearish. After tracking the stock market for 17 years, her model was showing 92% bearish, the most negative reading it had ever indicated. She went on Cable News Network’s “Money Line” predicting an “imminent collapse of the market”. Four days later the Dow crashed, and she was immediately declared a “Market Guru”. However, while she was able to predict the market crash, she was not able to predict its recovery. In the 9 years following her successful prediction she was only able to correctly predict the direction of the market 5 times out of 13 calls. If we look at the steps, we can say that the model had been developed and tweaked for 17 years. It was based on accepted scientific methods, but not necessarily proven theorems. So we would have to say that step 1 failed, steps 2 & 3 passed, step 4 is unknown, and step 5 would depend on your feeling about the health of the market. So again the 5-step evaluation model sends an unsure message as to whether to accept or reject the prediction.

To summarize, the five-step model failed to accept the two predictions even though the predictions were true. The first prediction was not only correct but has stood the test of time for the past 42 years. The second prediction was correct but could not be sustained. If these are typical successful predictions, then the model should be amended to the following conditions*:

1. The forecaster must have credible credentials
2. The forecaster must have current data
3. The lag between the prediction and the time of occurrence must be short
4. The forecast must be based on proven theorems
5. The methods must be sound
6. The forecaster must have a proven track record

*Note: A larger sampling would be required to prove or disprove the model with any level of confidence.


[1] Richard A. Ippolito, “Efficiency with Costly Information: A Study of Mutual Funds in the Period 1945-1964,” Journal of Finance 23(2) (1968): 389.

[2] William A. Sherden, “The Fortune Sellers: The Big Business of Buying and Selling Predictions,” John Wiley & Sons (1998)

[3] Patrick Gelsinger, keynote, InfoWorld CTO Forum, San Francisco (2002), http://www.intel.com/pressroom/archive/speeches/gelsinger20020409.htm

Sunday, September 9, 2007

Web 2.0 Google Docs and Spreadsheets

Attached is the transcript of my recent gabcast on Google Docs and Spreadsheets.



Hello, my name is Mark Gerschefske and I’m a doctoral student at Colorado Technical University. I’m presently studying Socio-Technical Futuring and reviewing the innovations that have been made with Web 2.0.

I would like to welcome everyone to my gabcast on Google Docs and Spreadsheets. This discussion will provide a brief overview of the collaborative environment for documents and spreadsheets as provided by Google.

This gabcast is structured into three parts… Part one will discuss Google Docs… Part two will cover Google Spreadsheets… and part three will be a general summation of both Google tools.


To start, I’ll discuss the abilities of Docs… I imported some sample text and then started to test the features in the toolbar. All the features, such as fonts and colored text, worked well. Different styles are supported, along with comments. One nice feature is the revision tab, which shows what changes have been made and by whom. The only negative comment I would make is that the editing features are limited to the more common ones; advanced features such as search and replace are not supported. However, it does support saving the text in Word, Open Office, and standard formats such as PDF, RTF, HTML, etc.

It allows you to insert objects such as images, links, comments, tables, bookmarks, and special characters. When you select special characters, it brings up a table showing special characters, Asian characters, wingdings, and advanced. The advanced tab allows you to specify the Unicode value and preview it before inserting it into the document. Very nice feature…

It also includes an HTML editor, which makes it a nice round package for the right price… free.


To change from Google Docs to Google Spreadsheets you must click on Docs Home in the upper right-hand corner of the screen. This takes you to your Google Docs home page, which shows all your folders. From there you click on NEW in the toolbar and select spreadsheet from the drop-down menu.

I started by copying a spreadsheet and pasting it into Google. The first thing I noticed was that all the highlighting disappeared, along with the charts. Next I changed some of the numerical fields to see if the formulas still worked. Nothing!! A little disappointing that the first two tasks failed… So I selected the Formulas tab to see if they were copied over… nothing there. That explained why it couldn’t recalculate the spreadsheet. The next test was to see if I could chart the data that was copied over. That seemed to work fairly well, except that it didn’t support logarithmic scaling, so if there is a wide delta between the data points some will appear off the scale. This could be considered an advanced function, so I can understand why it wasn’t supported.

Other features, such as adding another sheet or duplicating a worksheet, performed well. I decided to compose a simple spreadsheet to see how easy it was to use. I entered the year 2000 and clicked on the lower right corner while pressing CTRL, then dragged across the screen expecting it to increment by one. Instead I got 2003, 2002, 2004, 2001, and 2005. What I expected to see was 2000 incremented by one, ending in 2005. Maybe a new feature… HMMM. When I tried the same task starting from 1 and incrementing by ten… I got 1, 7, 6, 8, 9, 10, 11, 12, 13, and 14. Once it got to 8 it worked as expected, other than skipping 2-5. Well done, Google.
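For reference, the drag-fill behavior I expected is just a constant-step series; a few lines sketch what a correct fill should produce (this models my expectation, not Google's actual implementation):

```python
# Expected drag-fill behavior: extend a series from a start value by a
# constant step, the way a spreadsheet's fill handle normally works.
def fill_series(start, step, count):
    return [start + i * step for i in range(count)]

print(fill_series(2000, 1, 6))  # [2000, 2001, 2002, 2003, 2004, 2005]
```

Anything other than this monotonic sequence, such as the shuffled 2003, 2002, 2004… I actually got, is a bug by any reasonable definition.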

To summarize… Google Docs and Spreadsheets are not bad as entry-level collaboration tools. Both suffer from a lack of features that we have grown to expect when compared to Microsoft Office. I understand this may be an unfair comparison, but it is the benchmark everyone is compared to. Considering that these are free, Web-based tools, they are not bad for a first start. I did appear to find a bug in Google Spreadsheets in the way it increments numbers. I would have expected them to catch such a simple error in their testing prior to release. I guess Microsoft is not the only company that has problems with quality control. Another distraction was the time lag due to the tools being Web-based. I have a 1½ meg down Internet connection, but sometimes it took several seconds for changes to appear on the screen. As more features are added, I see Google Docs and Spreadsheets becoming popular with students and home users who don’t want to invest in expensive software.

This is Mark Gerschefske signing off and hoping that you found my gabcast on Google Docs and Spreadsheets informative. Thank you.

Monday, September 3, 2007

Web 2.0 – Second Generation Web

I have posted several discussions on what Web 2.0 is. Today I came across an article in the July/August 2007 IEEE IT Pro titled “Understanding Web 2.0” which I would like to share. In my previous postings I have discussed some of the directions of Web 2.0 and how services have evolved from what is called Web 1.0; as an example, screen-scraping technologies have evolved into Web services, and workers are driving the trend. Web 2.0 is the result of social trends that have enabled visionary entrepreneurs to launch such sites as YouTube, MySpace, and Facebook, to mention only a few. What I have failed to mention is what companies are doing to ensure that they are not left behind in this fast-moving Web 2.0 wave.

In a recent survey conducted by McKinsey Quarterly, 80% of the executives contacted said that their company was investing in Web services. They are using Web 2.0 to bring technology into their business to provide better communications with their customers and business partners. To improve efficiencies with external entities, companies are using peer-to-peer networking, blogs, and RSS feeds. Just as Web 2.0 is improving external communications, over half the companies surveyed stated that Web 2.0 tools were being used to improve the way internal knowledge is distributed and managed. Web 2.0 services are becoming strategic to how companies plan to grow their business. What is interesting is that a lot of these Web 2.0 initiatives are started by small groups or individuals, as previously stated. Due to their ease of use, these tools can be quickly prototyped to support customers and are ad hoc by nature. This allows companies to be very responsive to customers and their feedback.

The three technologies companies were most interested in are RSS feeds, blogs, and wikis, although these were not the Web 2.0 tools most often deployed. This shows that companies are excited about collaborative communication technologies and see this direction as having the biggest impact on how they will do business in the future.

Mashups are aggregations of content from multiple sources on a single Web page or Web site. They are usually built from APIs that allow interconnection with a wide array of programs. One example is the integration of online maps with business searches to show the user where a business is located. This kind of aggregation adds value and improves the user’s Web experience.
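At its core, a mashup is just the composition of two services' results on one page. A toy sketch with stand-in data (the function names and coordinates are invented for illustration; a real mashup would call the map and business-search APIs over HTTP):

```python
# Toy mashup: join business-search results with map coordinates, the way a
# maps-plus-local-search page does. Both "services" are hard-coded stand-ins.
def search_businesses(query):
    # Stand-in for a business-search API call.
    return [{"name": "Joe's Coffee"}, {"name": "Main St Books"}]

def geocode(name):
    # Stand-in for a mapping API's geocoder (lat, lon).
    coords = {
        "Joe's Coffee": (39.74, -104.99),
        "Main St Books": (39.75, -105.00),
    }
    return coords.get(name)

def mashup(query):
    # The aggregation step: annotate each search hit with its location.
    return [dict(biz, location=geocode(biz["name"]))
            for biz in search_businesses(query)]

for entry in mashup("coffee"):
    print(entry["name"], entry["location"])
```

The value is entirely in the aggregation step: neither source alone answers the question "where is a coffee shop near me?", but the join of the two does.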

The proliferation of Web 2.0 is causing companies to rethink how they need to do business to remain competitive. The use of collaborative technologies promises to improve efficiencies within companies and in how they interact with their business partners. With the expansion of global markets, the ability to collaborate across virtual boundaries will be the key to success.

San Murugesan, “Understanding Web 2.0,” IEEE IT Pro, July/August 2007.

“How Businesses Are Using Web 2.0: A McKinsey Global Survey,” The McKinsey Quarterly, March 2007, http://www.mckinseyquarterly.com/article_page.aspx?ar=1913&l2=13&l3=11&srid=9&gp=1


Wednesday, August 29, 2007

Web 2.0 Tools – Google Docs

This week I decided to try Google Docs for a collaboration project: writing a Computer Security Policy for our church. There are three of us on the committee, and it seemed like a good idea to work on the documents jointly.

The first step was to upload the files to my Google Docs folder. This seemed like a simple enough task. It turned out that there were 11 files for this project, and each file had to be uploaded separately. Not believing that Google would make you do this, I kept trying to select multiple files, but the Google Docs interface would not allow it. After uploading 12 files (I lost track and ended up uploading one file twice), I was able to copy them into a separate folder. Again, only one file at a time. At this point I was starting to get a little impatient, but I continued. The next task was to share these files. Logically, I thought I could just share the entire folder. Again I was wrong: each file must be shared individually. A pattern was beginning to form. This required that I email an invitation to each member of the group for each file. I think you are starting to get the picture: 11 files meant 11 email invites. After about the fourth invite I decided I had better see how well the collaboration tools actually worked before I spammed everyone’s mailbox.

The next task was to actually edit one of the documents. Since these were boilerplate security policy documents from the SANS Website, I needed to do a global find and replace. I quickly learned that this is not a supported feature. After manually cutting and pasting, I noticed the font type and size were different. After spending 10 minutes trying to get the font right, I decided just to cut and paste into MS Word. Even this operation was less than perfect, resulting in some of the formatting being lost.

After spending the best part of an hour, I have to conclude the experiment was less than a success. The concept is great; it’s just that the implementation is less than perfect. It may work well as a central repository, but the features of the word processor are inadequate for productive work. I would have to give Google Docs a C-. One could say, what do you expect for free?

Sunday, August 26, 2007

More on Web 2.0...

There has been much discussion on what Web 2.0 is. After nearly 1½ years there is still substantial disagreement on what Web 2.0 really means. Nearly everyone will agree that Web 2.0 includes social network sites, Wikipedia, blogs, and RSS feeds. Some will even go as far as to say that many of these services were included in Web 1.0 and that Web 2.0 is nothing but a marketing play.

Tim O’Reilly, in an attempt to clarify the meaning of Web 2.0, listed the following comparisons on his Website.

Web 1.0 --> Web 2.0
DoubleClick --> Google AdSense
Ofoto --> Flickr
Akamai --> BitTorrent
mp3.com --> Napster
Britannica Online --> Wikipedia
personal websites --> blogging
evite --> upcoming.org and EVDB
domain name speculation --> search engine optimization
page views --> cost per click
screen scraping --> web services
publishing --> participation
content management systems --> wikis
directories (taxonomy) --> tagging ("folksonomy")
stickiness --> syndication

When thinking of the progression from Web 1.0 to 2.0, one of the first examples is that of Netscape and Google. Netscape is thought of as the standard bearer of Web 1.0, inasmuch as its IPO was one of the defining events of that era. The same can be said for Google and Web 2.0. Netscape framed Web 1.0 as part of the old software paradigm, relying heavily on the Web browser, which it packaged as a desktop application. Google, on the other hand, has always been a native Web application delivering services, not packaged or sold software. Google and other Web 2.0 applications often took a radical approach in their design: BitTorrent made every client a server, and Wikipedia was based on the concept that any user could make an entry and that any entry could be edited by any user.

The power of Web 2.0 is the ability to extract useful information for a user from terabytes of available data. Google does this through its extensive search language, which allows a user to navigate all the data on the Internet to target a specific piece of information residing on an unknown server somewhere. This power has both a good side and a dark side, which will have to be brought under control: the same technology can be used to find credit card numbers, personal SSNs, and system administration passwords. A term has even been coined for this activity: Google hacking.

To summarize, some of the core competencies of Web 2.0 companies can be thought of as:
Services with ease of scalability
Services that get richer as more people use them
Services based on collective intelligence