Mobile Linux Gears Up

As I recently posted, handheld computing is set to take a big step forward, and with the hardware finally becoming suitable, there is a big question: proprietary or open software? Ubuntu is gearing up to make that a real choice [BBC].

As this post on the Ubuntu listserv explains,

it is clear that new types of device – small, handheld, graphical tablets which are Internet-enabled are going to change the way we communicate and collaborate. These devices place new demands on open source software and require innovative graphical interfaces, improved power management and better responsiveness.

Intel, specifically, has announced a new low-power processor and chipset architecture designed to allow full internet use on these mobile Internet devices.

Instead of limited-function services like web browsing over my cell phone — which is so expensive and clumsy that I never use it — we will have general-purpose and freely expandable computing in our hands. This is going to be big.

Project Honey Pot: New Tools

Project Honey Pot captures the addresses of spam bots it attracts, and allows site administrators to block those addresses. Recently, they have introduced some cool new tools.

I just installed the WordPress plugin for Http:BL, which makes it really easy to tap into Project Honey Pot’s ban list to keep the bad bots off this site. You can also easily create your own honey pot and turn in offenders.
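
For the technically curious, the whole http:BL check is just a specially formatted DNS lookup. Here’s a rough Python sketch of the kind of query the plugin performs; the access key is a placeholder, and my reading of the response octets follows Project Honey Pot’s documentation as I understand it, so treat it as an illustration rather than gospel:

```python
# Minimal sketch of an http:BL lookup. The access key below is a placeholder;
# the 127.days.threat.type response layout is my reading of the documented API.
import socket

ACCESS_KEY = "abcdefghijkl"  # hypothetical key; register with Project Honey Pot for a real one

def check_visitor(ip):
    """Return http:BL info for an IPv4 address, or None if it isn't listed."""
    reversed_ip = ".".join(reversed(ip.split(".")))
    query = f"{ACCESS_KEY}.{reversed_ip}.dnsbl.httpbl.org"
    try:
        answer = socket.gethostbyname(query)   # e.g. "127.3.25.4"
    except socket.gaierror:
        return None                            # not listed: probably not a known bad bot
    _, days, threat, visitor_type = (int(octet) for octet in answer.split("."))
    return {
        "days_since_last_seen": days,
        "threat_score": threat,                 # higher is worse
        "is_suspicious": bool(visitor_type & 1),
        "is_harvester": bool(visitor_type & 2),
        "is_comment_spammer": bool(visitor_type & 4),
    }

if __name__ == "__main__":
    print(check_visitor("203.0.113.7"))  # arbitrary example address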

I’ve been using Akismet to filter spam comments (thanks for the recommendation, Emily S.!), which is why you don’t see hundreds of ads for drugs and porn in the comments on this site. It catches everything and hasn’t falsely caught any legitimate comments yet. But blocking them here is only defensive — I’m glad to be able to help crack down on them overall.

Refining Google

Via Digg, I found an interesting article on Google’s attempts to prevent people from “gaming” its search results. Google’s PageRank algorithm, while secret, is known to consider the number and quality of incoming links to a site in its rankings. Therefore PageRank has working models of reputation, trust, etc.
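
To make the link-counting idea concrete, here is a toy power-iteration version of the published PageRank formula in Python. This is only the textbook algorithm (the signals Google layers on top of it are exactly the secret part), but it shows why a link from a well-linked page carries more weight than one from an obscure page, and why the rel="nofollow" attribute discussed below amounts to dropping an edge from this graph.

```python
# Toy PageRank: the published formula, not Google's production ranking.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue  # dangling pages are simply skipped in this toy version
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# A link from a popular hub counts for more than one from an obscure page.
web = {
    "hub": ["a", "b"],
    "a": ["hub"],
    "b": ["hub", "a"],
    "obscure": ["b"],
}
print(pagerank(web))
```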

In the article, Carsten Cumbrowski talks a lot of jargon and the writing becomes elliptical and dense at times, but the information he presents, and links to, comprises a very good background on issues with PageRank. He analyzes the NOFOLLOW attribute, an attempt to reduce the credence given to paid or otherwise less meaningful links. He also covers improvements to PageRank’s trust model:

It is like with people. You do not trust anybody you just have met. How quickly you trust somebody is less a time factor, but has to do with what the person is doing or is not doing and how much it does match what the person says about himself, his intentions and his plans.

Therefore the age of a site is a poor proxy for trustworthiness, and PageRank’s naive reliance on it was faulty. As I’ve posted before, an extreme amount of time and effort goes into reverse-engineering search algorithms, along a whole spectrum from benign “search engine optimization” to malicious exploitation of flaws. It’s an arms race in which the complexity of the system is determined as much by competitive pressure from its exploiters as by the desire for more useful search results.

Remember that the next time you rely on a search algorithm — or build a web service that relies on one.

Digg Gets Caught in HD-DVD Encryption Fight

As I posted a while back, a method was found for extracting the encryption keys for HD-DVDs and Blu-ray discs, allowing unauthorized decryption. Basically, the problem is that in order to let legitimate users play their movies, you have to give them both the locked version and the key. It’s just a matter of time before someone pulls that key out of the approved player and uses it to unlock whatever they like. In the case of HD-DVDs, it’s even worse because the encryption scheme depends on a single master key.
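
To be clear, what follows is not the actual AACS scheme, just a minimal Python sketch of the structural problem: any player capable of showing the movie has to hold a key that works on every disc protected under it, so extracting that key once unlocks everything.

```python
# Toy illustration only -- not AACS. One shared "master key" ships inside
# every licensed player; pulling it out of any one player unlocks every title.
import hashlib

MASTER_KEY = b"placeholder-master-key"  # hypothetical; real keys are (supposedly) secret

def keystream(key, title, length):
    """Derive a repeatable keystream from the key and disc title (toy construction)."""
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(key + title + counter.to_bytes(4, "big")).digest()
        counter += 1
    return stream[:length]

def lock(plaintext, title):
    return bytes(a ^ b for a, b in zip(plaintext, keystream(MASTER_KEY, title, len(plaintext))))

def unlock(ciphertext, title):
    return bytes(a ^ b for a, b in zip(ciphertext, keystream(MASTER_KEY, title, len(ciphertext))))

disc = lock(b"the movie bits", b"Some Film (HD)")
# Any player that can show the movie must be able to compute this,
# which means the key is, in practice, already in the customer's hands.
print(unlock(disc, b"Some Film (HD)"))
```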

Next came some twists that speak volumes about the current state of “intellectual property” and its radical opposition to free speech. Yesterday the story went around social bookmarking and discussion site Digg that the key had been found. Because the master key was so short, the original poster included it in the title of their post.

Digg received a DMCA takedown notice and decided to comply. Users went nuts, flooding the site with posts containing the key and lobbying Digg’s management to fight back. Finally, at the end of the day, founder Kevin Rose posted his decision.

First, this case highlights the fact that even if you have a good method of securing information, there is no reasonable level of lockdown that will prevent this type of “leak”. This is a tangent, but I think an enlightening one: the same thing applies to security from terrorism.

There is no such thing as a tradeoff of freedom for security, because security is an illusion. Inmates in maximum-security prisons still manage to murder each other. Unless you’re willing to impose restrictions on the public greater than those on prisoners, you can’t make violence impossible. If violence is possible, you can’t be secure — only more or less likely to be attacked, more or less likely to live through it.

Any amount of freedom in a society, which I hope we can agree is a good thing, brings with it “security holes”. The fact that we are not constantly at war with each other is a social construct based on alternative effective methods of conflict resolution. We are free to be violent, but mostly choose not to.

Back to freedom of information. Freedom of speech does not exist in a vacuum. Any amount of freedom of expression in a society depends upon the public’s ability to use and repurpose the expressions of others. No reasonable amount of restriction of free expression will achieve full control over access to information. And the crucial point is that with the internet, one lapse in access control leads to full publicity. Therefore the idea of trading freedom of expression for information security, in the public context, is an illusion.

Second, Digg’s reaction is a miniature version of the process that society as a whole is going through to readjust its beliefs and policies to internet technology. Because of the internet, that encryption key is now a piece of public information. And the public is getting tired of corporate interests manipulating the legal system, trying to put the cat back in the bag.

Digg recognizes that its value is dependent upon its social nature. Placing restrictions on users will dry up that well of participation and cause Digg to fail. It’s precisely the many-to-many nature of the internet that makes explicit the radical dependence of content and service providers on the good will of their users.

Without waiting for the market to sort out the problem, Digg listened directly to its users and made the right decision. In a sense, actually, the users made the decision. That’s a fundamental shift in the way things work, which is making its way through every institution — though mostly at a slower pace than Digg’s one-day turnaround.

Joost Readies Launch

Joost is a p2p streaming video system. Basically you watch TV on your computer; I’ve been using the beta and it works really well. The limitations are (1) you can’t save the videos, only stream them, and (2) your favorite show is probably not on Joost (yet).

Ars Technica reports that Joost will come out of beta later this month with content from CNN and other Turner networks, in addition to existing deals with CBS and Viacom.

While other net video services like YouTube and Google Video have been hugely successful with short clips, TV networks want more control and more DRM. I’m sure they’re also happy to save on high-bandwidth streaming servers and pass that task to users’ network connections. Joost is a few steps ahead and seems well-positioned to give the networks what they want.

I do think it’s cool that the old media are trying out new distribution methods. But they seem to be sacrificing some key features of web services — searching, repurposing, linking, and layering content via standard, open protocols and APIs. They’ve rebuilt the one-to-many TV network model on top of the many-to-many internet. This may be necessary to secure participation of the old media players but let’s not stop demanding full functionality — which means open interoperability — of these new services.

Deep Notes

Free software plug: I really like Deep Notes, a simple outlining app. When working on longer writing, I need to see the structure in outline form while being able to quickly dig down into that structure and look at the exact wording.

Deep Notes represents hierarchical structure with expand/condense buttons, like list view in the OS X Finder, giving complete control over the level of detail you see for every item. It’s easy to move items around in the hierarchy; it’s a dynamic model, which is exactly what I need when sorting out conceptual structures.
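
For anyone curious what that dynamic model looks like under the hood, here is a rough Python sketch of the kind of data structure an outliner implies: nodes with children, a collapsed flag for expand/condense, and a move operation for reorganizing. It’s my own illustration, not Deep Notes’ code.

```python
# Rough sketch of an outline model; not Deep Notes' actual implementation.
class OutlineItem:
    def __init__(self, text):
        self.text = text
        self.children = []
        self.collapsed = False   # the expand/condense toggle

    def add(self, text):
        child = OutlineItem(text)
        self.children.append(child)
        return child

    def move_to(self, old_parent, new_parent, position=0):
        """Re-parent this item, i.e. drag a branch to a new spot in the hierarchy."""
        old_parent.children.remove(self)
        new_parent.children.insert(position, self)

    def render(self, depth=0):
        print("  " * depth + ("+ " if self.collapsed else "- ") + self.text)
        if not self.collapsed:
            for child in self.children:
                child.render(depth + 1)

# Sketching an essay, then tucking a section out of sight:
essay = OutlineItem("Essay")
intro = essay.add("Introduction")
body = essay.add("Body")
body.add("First argument, with the exact wording I'm testing out")
body.collapsed = True            # condense to see just the structure
essay.render()

# ...and dragging the buried point up to the top level:
point = body.children[0]
point.move_to(body, essay, position=1)
essay.render()
```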

If anyone has seen this functionality in a web app, please let me know!

Koha Library System

The SLC library currently uses proprietary catalog software. It’s expensive, we can’t add features we want, and it won’t interoperate with our other systems like web servers, image databases, and our learning management system (which is a whole other problem in itself). So everyone was pleased when the opportunity arose to consider a different solution: Koha. It’s an open source integrated library system.

The bad news is that it’s still an immature product and lacks some features we would need, like a reserves module. The good news is that some of the developers close to the project have started a service company, LibLime, which will develop features and customizations and add them to the software. Rather than paying a software license fee to the proprietary vendor, who has little incentive to implement our feature requests, we could directly pay the developers to build the software we want.

LibLime’s approach is to treat customizations as preferences — switches that can be flipped to give different functionality from the same build of the software. This prevents forking and versioning issues, which were my key concerns with mission-critical open source software. The developers themselves take an integrative approach; they seem very interested in developing an extensive feature list in response to what librarians need and dealing with any conflicts at the preference level.
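
To picture what “customizations as preferences” means in practice, here is a rough sketch in Python (Koha itself is written in Perl, and these preference names are invented for the example): every institution runs the same build, and a settings table decides which behavior each one gets.

```python
# Illustration of the preferences-over-forks idea; the preference names are made up.
PREFERENCES = {
    "slc":       {"reserves_module": True,  "fine_grace_days": 3},
    "other_lib": {"reserves_module": False, "fine_grace_days": 0},
}

def checkout_rules(library):
    prefs = PREFERENCES[library]
    rules = ["standard circulation"]
    if prefs["reserves_module"]:
        rules.append("course reserves enabled")   # our requested feature, as a switch
    if prefs["fine_grace_days"]:
        rules.append(f"{prefs['fine_grace_days']}-day grace period on fines")
    return rules

# Same build of the software, different behavior per preference set:
print(checkout_rules("slc"))
print(checkout_rules("other_lib"))
```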

Often with proprietary software, one preference is forced on all users because that is less work for the developers. By contrast, the paid-development/open-source model means that the developers get paid for exactly the work they do, so they can afford to do things the hard way if that’s what users want.

Down the road, I’m concerned with making sure that the systems we implement are standards-compliant and talk to each other. The possibility of tying together a catalog/search solution like Koha with a web platform like Plone, another open source project, really raises the prospect of free and easy information flow around campus. The open source model means that these tools keep getting better and more available; what starts in the library and expands to the campus continues to spread across the entire internet.

Blog Archiving Survey

Jessamyn West posts about a survey put out by UNC-Chapel Hill’s School of Information and Library Science. They’re gathering data about bloggers’ habits and perceptions, with an eye to preserving blog content permanently.

It’s an interesting question — many blogs certainly fall into the category of journalism, and would be as useful as newspaper archives for historians and researchers. On the other hand, many bloggers post casually and treat the medium as ephemeral. Twitter takes that approach to the extreme.

In any case, I’m curious to see what the UNC folks make of the results.

Linux Proliferation

Linux is free. It does pretty much everything Windows does. So why hasn’t Linux taken over as the most popular desktop OS? Open Source Learning’s Jan Stedehouder answers that it’s a combination of factors but mostly inertia and politics.

This analysis is pretty straightforward; the new idea is to explicitly create an international working group with the goal of Linux proliferation. I happen to think that the consumer market is not the best solution for every problem. This is a case where it has really failed to pick the optimal solution and therefore intervention is a good idea.

On the other hand, open source software development is really a peculiar form of activism; in addition to those who identify as “freedom fighters”, it appeals to apolitical engineers and political bystanders who just want to see the best tools made available. Will their commitment to these tools extend into the explicitly political sphere?

New Handhelds: Open vs. Proprietary

At this year’s MacWorld, Apple didn’t announce any new computer products — unless you count the iPhone. In fact, Apple has delayed the release of its latest OS update, “Leopard”, until October in order to devote resources to the iPhone. It’s more than a phone — it’s a general-purpose handheld computer. To my eyes, the phone function is a bonus.

As usual, Apple is ahead of the curve. Yesterday, Intel announced its partnership with several other hardware manufacturers to produce a $500 handheld. It will run Linux as well as Windows Vista, which actually makes it a more open platform than the iPhone, for which Apple has announced no plans to release the API.

Since the market failure of the Apple Newton, we’ve been stuck in a world of low-powered, non-standard-OS-running, clunky-interface PDAs. With the huge popularity of “smart phones”, there’s a new willingness to pursue the handheld format. Display technology has also come a long way. And I, for one, have been literally waiting since the Newton.

A few years ago, Duke U. gave out iPods to its entire freshman class. For this year’s freshmen, that’s probably redundant since iPods are more popular than beer. All kidding aside, as popular as laptops are, they don’t get carried everywhere. Cellphones do, but they are locked down and designed so as not to function as general-purpose computers. Handhelds offer extreme integration of computers into daily experience.

College campuses will be the laboratories of this new technology’s cultural impact. We’re not committed to productivity per se; college students will find the fun uses — as well as the innovative workflows that those “on the clock” wouldn’t think to try. One lesson of Web 2.0 is that you don’t design a social environment — you give everyone access and if the product is cool, some of the thousands or millions of users will contribute to it, leverage it, improve it, and turn it into something great.

Of course, you don’t want to compromise the original functionality by allowing remixes. The question of how open to be is very much live right now. See MySpace vs. embedded media widgets, or Alexa vs. Alexaholic. As Wired’s Eliot Van Buskirk says in the MySpace article,

Its closest competitor, Facebook, has unannounced (but confirmed) plans to open its site to third-party widgets for the first time. Ultimately, the two sites could come to resemble each other, but which will users prefer? Surely, the one that’s more open and transparent. That approach has prevailed over and over on the web.

Will the public continue to vote with their clicks for the open web model? Probably. Will software and hardware makers draw that analogy to their products? Eventually. I believe that, barring anti-competitive manipulation (e.g. misuse of copyright and patent law), the open model will prevail. But man, the iPhone looks cool…