Friday, November 29, 2013

Thrun shifts Udacity toward lifelong vocational education

You will want to read this article on Udacity's Sebastian Thrun, the man who popularized MOOCs with his 2011 artificial intelligence class.

He has abandoned the goal of bringing conventional college courses to low-performing students in developed and developing countries and has pivoted toward vocational education.

The switch was motivated by the poor results of San Jose State University students who used Udacity material rather than conventional textbooks in "flipped" classrooms. The results left Thrun disillusioned -- "I'd aspired to give people a profound education--to teach them something substantial, but the data was at odds with this idea."

Thrun himself prepared the material for an introductory statistics course to be used at San Jose State. He did his best, stating that "From a pedagogical perspective, it was the best I could have done -- It was a good class."

But the students did poorly, and Thrun concluded that Udacity had a "lousy product."

I think Thrun's elite background led him down a garden path. Any San Jose State professor who had taught an introduction to statistics could have told him that many (most?) of his students would not have basic arithmetic skills and would "hate math." They are not Stanford students.

His response to this experience is to focus on industrial education, stating that "At the end of the day, the true value proposition of education is employment." As such, Udacity is developing an AT&T-sponsored master's degree at Georgia Tech and training material for developers.

Thrun's revelation has produced a sense of schadenfreude among some academics, who were happy to say "we told you so" and perhaps breathe a sigh of relief that their jobs will remain secure.

But, even if many college courses are best taught in a conventional classroom or in a flipped class with a conventional textbook, those jobs may still be in danger. If Thrun is right about industrial education -- jobs -- being the true value of education, we may see more and more students foregoing college altogether for offerings like Thrun's.

Those San Jose students want jobs and they are increasingly aware of the rising direct cost and opportunity cost of a college degree.

When speaking of his five-year-old son, Thrun reveals his vision of the future of education, saying "I hope he can hit the workforce relatively early and engage in lifelong education -- I wish to do away with the idea of spending one big chunk of time learning."

Udacity has already moved into workforce training, and I think they have been working on lifelong learning all along.
-----

Update 11/30/2013
There is a long discussion of this post on Slashdot.
-----
Update 3/25/2014

As Udacity moves away from university education, Coursera hires a university administrator as CEO -- Richard C. Levin, who was president of Yale University for 20 years. No one seems to have figured out the university-course business model, but Levin's international experience indicates that he sees a global market -- for students and teachers.

Sunday, November 17, 2013

Wolfram Language -- a paradigm shift?

As technology evolves, working with new data types becomes economically feasible. The first computers worked on numbers -- they "computed." Then came alphanumeric data, text, speech, music, etc. This table shows the decade in which each data type became widely available:


Wolfram Language sounds like it might change this paradigm. This year marks the 25th anniversary of Stephen Wolfram's symbolic math program Mathematica. While Wolfram has introduced several "products" over the years, his has been a 25-year research and development project on generalized symbolic programming -- developing what he calls Wolfram Language.

Here is how he describes it:
There’s a fundamental idea that’s at the foundation of the Wolfram Language: the idea of symbolic programming, and the idea of representing everything as a symbolic expression. It’s been an embarrassingly gradual process over the course of decades for me to understand just how powerful this idea is. That there’s a completely general and uniform way to represent things, and that at every level that representation is immediately and fluidly accessible to computation.

It can be an array of data. Or a piece of graphics. Or an algebraic formula. Or a network. Or a time series. Or a geographic location. Or a user interface. Or a document. Or a piece of code. All of these are just symbolic expressions which can be combined or manipulated in a very uniform way.

And here is how he visualizes it:

It sounds like a uniform way to represent and operate on everything from text strings to user interfaces.
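
As a rough illustration of what that uniformity means -- my own sketch in Mathematica, from which the Wolfram Language grows, not an example taken from Wolfram's post -- every value carries a head that says what kind of expression it is, and the same structural operations apply to all of them:

FullForm[a + b^2]
(* Plus[a, Power[b, 2]] -- an algebraic formula is itself a nested symbolic expression *)

Head /@ {1 + x, {1, 2, 3}, "hello", Plot[Sin[t], {t, 0, 2 Pi}]}
(* {Plus, List, String, Graphics} -- a formula, a list, a string and a plot can all be inspected and manipulated the same way *)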

Wolfram's blog post is vague but enticing. He promises to make Wolfram Language freely available in the cloud so we will be able to see it for ourselves. If this blog post had been written by someone at a Silicon Valley startup, I would have dismissed it as pre-marketing hype. But, it wasn't. Instead, it reminded me of the "Preliminary discussion" of the von Neumann architecture I read as a student many years ago.

John von Neumann beside the IAS computer at Princeton
-----




Update 11/18/2013

John Graves commented on the natural language processing promises in Wolfram's post.
-----

Update 11/22/2013

Wolfram has posted a Language reference manual on their site. It is primarily a reference manual, but there are some examples and tutorials.

-----

Update 12/8/2013

Stephen Wolfram gave the keynote talk at The Next Web in Amsterdam (video below). He mentioned Mathematica, but spent most of the time on Wolfram Alpha, Wolfram Language, complexity arising from simple procedures, and data analysis and visualization.

He put the language in context by saying that "Wolfram Alpha is basically 15 million lines of Wolfram Language code plus some number of terabytes of raw data plus a whole collection of real-time feeds." The description and demo of Wolfram Language begin around the 6m 50s point in his talk.

Wolfram talks fast and is somewhat elliptical, but he peppers his presentation with demos. One might be tempted to write his talk off as hype, but the demos are real. The most impressive was this one-liner:
Dynamic[EdgeDetect[CurrentImage[]]]
which produced real time edge detection as he moved his hand in front of a camera. (It's at the 10m 10s point of the video). Impressive as that was, one has to wonder how brittle the system is. Wolfram knew the language included an edge detection algorithm, but can it be used for a variety of problems and applications? I cannot figure out a query that will get Wolfram Alpha to tell me the number of LTE mobile subscribers -- that is a bad sign.

Here's Wolfram's keynote:

Friday, November 15, 2013

Crowdsourcing college rating and innovation -- President Obama's affordable college initiative

Department of Education officials, led by Under Secretary of Education Martha Kanter, were on our campus last week, soliciting input on the president’s College Value and Affordability plan.

There were open forums and a lunch with several faculty and administrators. I was not able to attend the open forums, but was at the lunch.

The Los Angeles Times covered the forums, reporting that the plan met with skepticism about the feasibility of developing an effective rating system. Politico's coverage of the event also focused on the difficulties.

The lunch meeting was similar. Ms. Kanter described their plan and asked for input. While we all favor increased market transparency to help students pick a college, doing so is difficult. Many of the comments were warnings about shortcomings -- unanticipated side-effects of any rating system, differences in student preparation in high school, differences in levels of State support for schools, etc.

A multivariate problem like selecting a college or comparing two colleges does not have a single solution. It is like the proverbial blind men exploring an elephant -- complex and subjective.

But, that does not mean there is nothing for government to do. The role of the government in this case is to gather and publish data, not to analyze it (or, at most, to be only one of many analysts).

The government should publish data on colleges and outcomes in an open, easily manipulated format. Let the people -- policy experts, teachers, administrators, prospective students and their families -- anyone with an interest -- do their own data analysis and draw their own conclusions. The system should also provide a platform for presenting, sharing and discussing those analyses and conclusions.

Tools for analyzing and visualizing the data should also be provided and it should be possible for a person to document their analysis using these tools in such a manner that others could replicate it, re-run it over time, or modify it. The analysis and visualization toolkit should also be open and, of course, users should be able to analyze and visualize the data using their own tools and methods.
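
To make the replication idea concrete, here is a hypothetical sketch, in the Mathematica/Wolfram notation discussed in the post above, of the kind of short, shareable analysis I have in mind. The URL and column layout are invented for illustration; the point is that anyone could re-run or modify the same few lines as the data are updated:

(* hypothetical open data file; the URL and column positions are made up for illustration *)
data = Import["http://data.ed.gov/college-costs.csv", "CSV"];
rows = Rest[data];                  (* drop the header row *)
pairs = rows[[All, {3, 4}]];        (* assume column 3 is net price, column 4 is six-year graduation rate *)
ListPlot[pairs, AxesLabel -> {"Net price", "Graduation rate"}]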

In short, government should publish accurate, open data and let a thousand eyes see it.

The President's plan also addresses educational quality. Their Fact Sheet on President Obama’s Plan to Make College More Affordable discusses technology-enabled innovation, singling out several examples at elite universities and Coursera.

Coursera and elite universities are developing valuable technology and pedagogy, but, just as I would open the data analysis process, I would have the government promote open innovation in education. There are gifted teachers and innovators in schools, colleges, high schools, design firms, etc. throughout the world. Furthermore, faculty in these schools often have more experience with the sorts of students who cannot afford education today than do those at elite universities.

Can we find ways to include the contributions of those people in the President's program?

For example, MIT, Stanford and Google are collaborating on an open source platform for MOOCs (MOOC.org). An open, hosted version of that platform would be a valuable resource for faculty and others around the world. Google has the infrastructure – not just connectivity, but services like Google Plus Communities, Hangouts, YouTube and Google Drive – to create that hosted platform. It would be terrific if the Department of Education could work with them to create an open "YouTube for education". (If Google were not willing, the CSU system or others could do it).

There are also many examples of the use of fine-grained, focused modules in education -- most prominently at the Khan Academy. Modular material can take many forms – a short PowerPoint presentation or video demonstration, a thought-provoking quote, an anecdote, an image, an animation, a question, an assignment, a simulation, a grading technique, etc. Every teacher has some such "modules" that could be of use to others; the trick is to make them discoverable. Could the Department of Education create a system for doing that?

High school teachers and others working on Common Core curricula are also a resource for colleges. Dominguez Hills and others spend a lot of time on remediation. We should encourage colleges to discover and incorporate teaching material and insights from those working on the Common Core.

Let a thousand flowers bloom.

-----


Update 11/25/2013

This article on the forum appeared in our campus publication, Inside Dominguez. It contains several student and faculty comments.
-----

Update 11/25/2013

There is a fairly long discussion of this post on Slashdot.

Wednesday, November 13, 2013

Los Angeles to request city-wide fiber proposals

Ars Technica reports that the City of Los Angeles plans to issue a request for proposals (RFP) to bring fiber connectivity to every resident. The RFP is not out yet, but it sounds as though the city is not looking for a retail Internet service provider, but for open infrastructure which competitors could use to offer Internet service.

The article is vague, but it seems the city envisions an open network with wholesale pricing for anyone who wants to compete as a retail Internet service provider. That is reminiscent of the successful approach taken in Stockholm.

But it is also reminiscent of the thwarted desire of the United States Congress in passing the Telecommunications Act of 1996. The incumbent telephone and cable companies used the courts, lobbying, and claims of limited facilities to kill the would-be competition.

The rumored RFP has characteristics of Google Fiber, like free or ad-supported low-speed connectivity for all, tiered pricing for high-speed Internet, television and telephone, and free or subsidized connectivity to non-profits. On the other hand, Los Angeles is said to seek business access, which Google does not allow.

Speaking of Google Fiber, they may very well have plans to go nationwide -- might they be a bidder in Los Angeles?

The article says the RFP has the support of recently elected council member Bob Blumenfield and new mayor Eric Garcetti.

I live in Los Angeles and long ago gave up hope that Verizon would deliver their fiber service, FiOS, to my neighborhood, so I am stuck with only one viable Internet service provider. (Yes, it's a monopoly.) I would love to see something come of this, but seeing is believing.

-----
Update 7/22/2014

The city has received 34 responses to its request for information (RFI). They came from city departments as well as private companies, including my current monopoly ISP, Time Warner Cable, and Angie Communications, a Dutch company that submitted an optimistic proposal. The city will now take these responses into account in drafting a request for proposals.