We’ve traveled many roads together, my friends, but I take my leave of you now to commit seppuku in the proud warrior tradition. Perhaps you’ll see an iPhone in a storefront sometime and think of your old pal AP, hm?

Online video leader YouTube on Monday rolled out long-awaited technology to automatically remove copyrighted clips, hoping to placate movie and television studios fed up with the Web site’s persistent piracy problems.

The filtering tools are designed so the owners of copyrighted video can block their material from appearing on YouTube, which has become a pop culture phenomenon in its two-year existence. The tools also give those owners the option to sell ads around their material if they want the clips to remain available on YouTube.

They had no choice. They’re looking at a $1 billion lawsuit from Viacom and the only way they can get to the statutory safe harbor protecting them from liability is to take proactive steps to “accommodate … standard technical measures used to identify and protect copyrighted works.” They’ve done that now…

…or have they? It might not be time to plunge that dagger just yet:

Louis Solomon, a lawyer representing an English soccer league and music publisher Bourne Co. in another copyright infringement case against YouTube, criticized the new filtering system as "wholly inadequate."

“It does nothing about the past and won’t be enough to protect the future,” Solomon said.

YouTube now needs the cooperation of copyright owners for its filtering system to work: copyright holders must provide copies of the video they want to protect so YouTube can compare those digital files against material being uploaded to its Web site.

This means that movie and TV studios will have to hand over decades' worth of copyrighted material if they don't want it appearing on YouTube, or else spend even more time scanning the site for violations.
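The comparison step described above can be sketched in a few lines. This is a toy stand-in, not YouTube's actual system (which relies on perceptual fingerprints that survive re-encoding, not raw byte hashes); the function names and the 50% overlap threshold are my own hypothetical choices:

```python
import hashlib

def fingerprint(data: bytes, chunk_size: int = 4096) -> set[str]:
    # Hash fixed-size chunks of a video file into a set of "fingerprints."
    # (Toy version: a real system fingerprints the audio/video content
    # itself so matches survive re-encoding and cropping.)
    return {
        hashlib.sha256(data[i:i + chunk_size]).hexdigest()
        for i in range(0, len(data), chunk_size)
    }

def likely_match(reference: bytes, upload: bytes, threshold: float = 0.5) -> bool:
    # Flag an upload if it shares enough chunk fingerprints with a
    # rights-holder-supplied reference copy -- hence the need for studios
    # to provide those copies in the first place.
    ref_prints = fingerprint(reference)
    if not ref_prints:
        return False
    overlap = len(ref_prints & fingerprint(upload)) / len(ref_prints)
    return overlap >= threshold
```

The point the sketch makes concrete: without the `reference` file from the copyright holder, there is nothing to compare an upload against, which is exactly the cooperation problem the article describes.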

The logical solution here would be to have content providers embed some sort of digital signature in their videos that YouTube's software would scan for. If the signature is detected, it refuses to upload the video. Hackers will crack that code in two minutes, but cracking the code would be ipso facto proof of willful infringement, which carries with it draconian damages. Make an example of a few hackers in court and they'll straighten up pretty quick.

Meanwhile, I love the idea mentioned up top of allowing the videos to be posted but affixing ads to them that would generate revenue for the content provider. That seems a smart compromise solution going forward into the digital age, as it would leave information more or less free while making it worth the creator's while to keep producing it.
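For what it's worth, the embedded-signature idea boils down to something like this. Everything here is hypothetical (the marker bytes, the function names), and the naive version below is precisely the "crackable in two minutes" one, since a real watermark would be woven into the video frames rather than tacked onto the file:

```python
# Hypothetical rights-holder marker; a real watermark would be embedded
# in the picture data so it survives re-encoding, not appended as bytes.
SIGNATURE = b"\x00COPYRIGHT-MARK\x00"

def stamp(video: bytes) -> bytes:
    # The content provider marks the file before release.
    return video + SIGNATURE

def allow_upload(video: bytes) -> bool:
    # The upload pipeline refuses anything carrying the marker.
    return SIGNATURE not in video
```

Stripping `SIGNATURE` before uploading is trivial, which is the whole legal point: the act of stripping it is evidence of willful infringement.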

We’ll see how it goes this month.