People are scared; everyone's video has been made: Captain Safdar

ranaji

President (40k+ posts)
According to Rana Sanaullah, this shameless, accursed Captain Safdar is supposed to have committed adultery with his own mother-in-law, a woman like a mother to him. What would such a shameless man fear? This is a whole clan of pimps; such matters hold no fear for them. In any case, this Captain Safdar is a confirmed supplier to the Qataris, and a man whose earnings come from pimping has neither honour nor anything to fear. He has cast off his own and his family's shame and slung it over his shoulder. He is not the captain of the shameless but their field marshal, the shameless-in-chief of Pakistan.
 

abduloz

Political Worker (100+ posts)
The way these people are talking, they are trying to pin the blame on certain institutions, but people should instead ask these haraami, filthy, crooked politicians: who told these morons to sleep with women outside their marriages in the first place, so that they could be caught on camera?

You know, guys, this country's political system is rotten and needs to change. We thought Imran could bring big changes, but even as a very big fan of Imran Khan, I think he has failed to change the country's system and the mentality of its people, and that is very sad.
 

Dastgir khan19

Minister (2k+ posts)

A horrifying new AI app swaps women into porn videos with a click

Deepfake researchers have long feared the day this would arrive.

Update: As of September 14, a day after this story was published, Y posted a new notice saying it is now unavailable. We will continue to monitor the site for further changes.

The website is eye-catching for its simplicity. Against a white backdrop, a giant blue button invites visitors to upload a picture of a face. Below the button, four AI-generated faces allow you to test the service. Above it, the tag line boldly proclaims the purpose: turn anyone into a porn star by using deepfake technology to swap the person’s face into an adult video. All it requires is the picture and the push of a button.

MIT Technology Review has chosen not to name the service, which we will call Y, or use any direct quotes and screenshots of its contents, to avoid driving traffic to the site. It was discovered and brought to our attention by deepfake researcher Henry Ajder, who has been tracking the evolution and rise of synthetic media online.

For now, Y exists in relative obscurity, with a small user base actively giving the creator development feedback in online forums. But researchers have feared that an app like this would emerge, breaching an ethical line no other service has crossed before.

From the beginning, deepfakes, or AI-generated synthetic media, have primarily been used to create pornographic representations of women, who often find this psychologically devastating. The original Reddit creator who popularized the technology face-swapped female celebrities’ faces into porn videos. To this day, the research company Sensity AI estimates, between 90% and 95% of all online deepfake videos are nonconsensual porn, and around 90% of those feature women.
As the technology has advanced, numerous easy-to-use no-code tools have also emerged, allowing users to “strip” the clothes off female bodies in images. Many of these services have since been forced offline, but the code still exists in open-source repositories and has continued to resurface in new forms. The latest such site received over 6.7 million visits in August, according to the researcher Genevieve Oh, who discovered it. It has yet to be taken offline.
Related: After years of activists fighting to protect victims of image-based sexual violence, deepfakes are finally forcing lawmakers to pay attention.
https://www.technologyreview.com/2021/02/12/1018222/deepfake-revenge-porn-coming-ban/
There have been other single-photo face-swapping apps, like ZAO or ReFace, that place users into selected scenes from mainstream movies or pop videos. But as the first dedicated pornographic face-swapping app, Y takes this to a new level. It’s “tailor-made” to create pornographic images of people without their consent, says Adam Dodge, the founder of EndTAB, a nonprofit that educates people about technology-enabled abuse. This makes it easier for the creators to improve the technology for this specific use case and entices people who otherwise wouldn’t have thought about creating deepfake porn. “Anytime you specialize like that, it creates a new corner of the internet that will draw in new users,” Dodge says.
Y is incredibly easy to use. Once a user uploads a photo of a face, the site opens up a library of porn videos. The vast majority feature women, though a small handful also feature men, mostly in gay porn. A user can then select any video to generate a preview of the face-swapped result within seconds—and pay to download the full version.
The results are far from perfect. Many of the face swaps are obviously fake, with the faces shimmering and distorting as they turn different angles. But to a casual observer, some are subtle enough to pass, and the trajectory of deepfakes has already shown how quickly they can become indistinguishable from reality. Some experts argue that the quality of the deepfake also doesn’t really matter because the psychological toll on victims can be the same either way. And many members of the public remain unaware that such technology exists, so even low-quality face swaps can be capable of fooling people.
Y bills itself as a safe and responsible tool for exploring sexual fantasies. The language on the site encourages users to upload their own face. But nothing prevents them from uploading other people’s faces, and comments on online forums suggest that users have already been doing just that.
The consequences for women and girls targeted by such activity can be crushing. At a psychological level, these videos can feel as violating as revenge porn—real intimate videos filmed or released without consent. “This kind of abuse—where people misrepresent your identity, name, reputation, and alter it in such violating ways—shatters you to the core,” says Noelle Martin, an Australian activist who has been targeted by a deepfake porn campaign.

https://www.technologyreview.com/2021/09/13/1035449/ai-deepfake-app-face-swaps-women-into-porn/
 

arifkarim

Prime Minister (20k+ posts)

This same deepfake technology is behind Sindh's "development" too, you donkey.
 

The Sane

Chief Minister (5k+ posts)

The app was created by Baadshah.
 

ranaji

President (40k+ posts)
Pronouncements of His Exalted Highness, Pimp Number One of the shrine of Garage Sharif, patron saint of the Qataris.