Google's YouTube unit says new software improved its ability to spot objectionable content sixfold within a matter of weeks, according to a report in Fast Company, as the company tries to calm advertisers and European leaders alarmed by some of its online fare. A van struck a crowd of people outside a London mosque on Sunday, the second time this month that a vehicle was used as a weapon in the city, and less than a week after a gunman attacked U.S. lawmakers on a baseball field. Some proposed measures would hold companies legally accountable for the material posted on their sites, a liability that Google and other internet companies are trying to avert.
Facebook said last week it would use its own AI-powered software and hire more terrorism experts after leaders of the United Kingdom and France threatened new laws to punish companies whose content stays online long enough for terrorists to spread their message.
"There should be no place for terrorist content on our services," Google general counsel Kent Walker wrote Sunday in a blog post that was also published as an opinion piece in the Financial Times.
Anti-hate groups like the Southern Poverty Law Center have skewered Google and Facebook for doing too little to muzzle hate groups online.
Google will also put new restrictions on videos with "inflammatory religious or supremacist content", placing them behind a warning message and preventing them from being monetized, recommended, commented on, or endorsed by users.
Google, along with other companies such as Facebook, Microsoft and Twitter, recently agreed to create an international forum to share and develop technology, support smaller businesses and speed up their joint efforts against online terrorism.
To step up its policing efforts, Google will almost double the number of independent experts it uses to flag problematic content and expand its work with counter-extremist groups to help identify content that may be used to radicalize and recruit terrorists.
Walker added that Google also plans to expand its efforts to fight online radicalization, something it already targets through programs such as Creators for Change, which promotes anti-hate voices on YouTube.
In its final step, YouTube is working with Jigsaw to implement the "Redirect Method" across Europe, which redirects potential Islamic State recruits toward anti-terrorist videos in an effort to sway them not to join the terrorist group.