Extremist videos can be blocked automatically

Software normally used by big content owners to sniff out pirated material could soon have a new purpose: kicking extremist content offline.

The software is reportedly being tested now with some success. YouTube and Facebook are among the sites deploying systems to block or rapidly take down Islamic State videos and other similar material, the sources said.

The technology looks for “hashes,” a type of unique digital fingerprint that internet companies automatically assign to specific videos, allowing all content with matching fingerprints to be removed rapidly.

Such a system would catch attempts to repost content already identified as unacceptable, but would not automatically block videos that have not been seen before.
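The matching described above can be sketched in a few lines. This is a simplified illustration, not the companies' actual systems: the `BANNED_HASHES` set and function names are hypothetical, and a cryptographic digest like SHA-256 is used here for clarity even though production systems rely on perceptual hashes that survive re-encoding and minor edits.

```python
import hashlib

# Hypothetical database of fingerprints for content already flagged
# by human reviewers (here, the SHA-256 of an empty file as a stand-in).
BANNED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(video_bytes: bytes) -> str:
    """Digest of the raw file. A cryptographic hash only catches
    byte-for-byte copies; real systems use perceptual hashing."""
    return hashlib.sha256(video_bytes).hexdigest()

def should_block(video_bytes: bytes) -> bool:
    """Block an upload if its fingerprint matches known banned content."""
    return fingerprint(video_bytes) in BANNED_HASHES

print(should_block(b""))          # True: exact re-upload of banned content
print(should_block(b"new clip"))  # False: unseen video passes to review
```

As the sketch shows, an exact repost is caught instantly, but a never-before-seen video produces no match and still depends on human flagging and review.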

The companies would not confirm that they are using the method or talk about how it might be employed, but numerous people familiar with the technology said that posted videos could be checked against a database of banned content to identify new postings of, say, a beheading or a lecture inciting violence.

Use of the new technology is likely to be refined over time as internet companies continue to discuss the issue internally and with competitors and other interested parties.

Most companies have until now relied mainly on users to flag content that violates their terms of service, and many still do. Flagged material is then individually reviewed by human editors, who delete postings found to be in violation.

The companies now using automation are not publicly discussing it, two sources said, in part out of concern that terrorists might learn how to manipulate their systems or that repressive regimes might insist the technology be used to censor opponents.