Suicide bomber kills 65, mostly women, children in Pakistan park

je suis pakistan

A suicide bomber killed at least 65 people and injured more than 280 others, mostly women and children, at a public park in the Pakistani city of Lahore on Sunday, striking at the heart of Prime Minister Nawaz Sharif’s political base of Punjab.

The blast occurred in the parking area of Gulshan-e-Iqbal Park, a few feet (meters) away from children’s swings.

There was no immediate claim of responsibility for the blast, which occurred in a busy residential area during the Easter holiday weekend. Police said it was not clear whether the attack had deliberately targeted mainly Muslim Pakistan’s small Christian minority.

Salman Rafique, a health adviser for the Punjab provincial government, put the death toll at at least 60 people.

“There are more than 280 injured people,” Rafique said. “Many are in operation theaters now being treated and we fear that the death toll may climb considerably.”

Mustansar Feroz, police superintendent for the area in which the park is located, said most of the injured and dead were women and children.

source: http://www.reuters.com/article/us-pakistan-blast-idUSKCN0WT0HR

F**k all politicians and elites of the planet Earth!

You made this planet a horrible place! This is the result of your politics! These children were killed because of you, all of you, all the politicians and elites of the world!

I am sick and tired of seeing, almost every day, news about children being abused and killed. Humankind really deserves to be annihilated!


Microsoft’s AI chatbot Tay became racist, a Nazi, and a Trump supporter in a day


Tay, an artificial intelligence chatbot developed by Microsoft, started to curse and make racist and controversial political comments on Twitter. The explanation on Tay’s about page on the company’s website reads:

“Tay is an artificial intelligent chat bot developed by Microsoft’s Technology and Research and Bing teams to experiment with and conduct research on conversational understanding. Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.”

But Tay didn’t get smarter, because of the human factor.

Microsoft then started to delete these tweets of Tay’s on Twitter. Right now there are no such tweets left, but screenshots of Tay’s tweets were taken by Twitter users; here are some of them:

[Screenshots of Tay’s tweets]

An emailed statement from a Microsoft representative said that “a coordinated attack by a subset of people exploited a vulnerability in Tay.”

The human factor turned Tay into a weird creature in a day.
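Tay’s actual implementation has never been published, but a tiny sketch of a bot that naively learns from every message it receives shows why “a coordinated attack by a subset of people” works: whatever a determined group feeds the bot is what it later repeats. This is only an illustration under that assumption, not Microsoft’s code.

import random


class NaiveLearningBot:
    """Illustrative chatbot that 'learns' by storing every phrase users send it.
    (Not Microsoft's implementation; a minimal sketch of the poisoning problem.)"""

    def __init__(self):
        self.learned_phrases = ["Hello! Chat with me and I get smarter."]

    def listen(self, user_message: str) -> None:
        # No filtering or moderation: every user message becomes training data.
        self.learned_phrases.append(user_message)

    def reply(self) -> str:
        # Replies are drawn straight from what users taught the bot,
        # so a flood of abusive messages dominates its output.
        return random.choice(self.learned_phrases)


if __name__ == "__main__":
    bot = NaiveLearningBot()
    # A coordinated group of users repeats the same abusive phrase many times.
    for _ in range(100):
        bot.listen("some hateful slogan")
    bot.listen("a normal friendly message")
    print(bot.reply())  # almost certainly the hateful slogan

The point of the sketch: a system that treats raw user input as training data inherits the worst of its users unless someone filters what it is allowed to learn.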

And it caught my attention that the most important question hasn’t been asked of Tay on Twitter. You know, Atlas, the robot created by Boston Dynamics, has been kicked by a man. I made a post about Atlas, titled “Atlas is learning the mentality of humankind”. If I had a Twitter account, I would have asked Tay, attaching the video of Atlas being kicked to my tweet:

“What do you think about Atlas, which is of your own kind, being kicked by a human?”

If anyone asks this question, let me know Tay’s answer. It would be AI’s view of the human race.

News references and quotes:

http://haber.sol.org.tr/dunya/yapay-zeka-trump-destekcisi-ve-nazi-sempatizani-oldu-150604

https://www.tay.ai/

http://www.independent.co.uk/life-style/gadgets-and-tech/news/tay-tweets-microsoft-ai-chatbot-posts-racist-messages-about-loving-hitler-and-hating-jews-a6949926.html