
Tor Browser Bug Executes Unwanted JavaScript Code!


The Tor Project has warned users to stay vigilant about a bug in the Tor Browser that causes JavaScript code to run on sites where it should be blocked.

Tor (originally The Onion Router) is free and open-source software that enables anonymous communication for its users.

Reportedly, the team is working on a fix and will roll it out as soon as it is ready, but has not given a timeline.

According to sources, one of the most critical security features of the Tor Browser Bundle (TBB) is its ability to block JavaScript execution.

TBB is a browser with a set of strong privacy features, chiefly aimed at concealing users' real IP addresses to keep them, and their devices' locations, anonymous online.

Owing to these features, the browser has become a go-to tool for journalists, citizens of repressive countries, and political activists, since it is an effective instrument for dodging online censorship and firewalls.

Adversaries of user anonymity have repeatedly tried to expose Tor Browser users' real IP addresses through exploits that rely on JavaScript code.

Sources note that while some of these attempts were legitimately used to track down criminals, others were carried out for far murkier purposes.

Recently, a bug was discovered in TBB's much-praised security mechanism: even when the browser was set to its highest security level, it still permitted JavaScript execution that should have been blocked.

It is a relief that the Tor team is well aware of the bug and is working on a patch for it. Per sources, they also mentioned that users who need to block JavaScript can disable it entirely in the meantime.

As per reports, the procedure is to open "about:config" and search for "javascript.enabled". If the "Value" column shows "false", JavaScript is already disabled; if it shows "true", right-click the row and select "Toggle" (or double-click the row) to disable it.
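For users who want the setting to persist across restarts, the same preference can also be set in a `user.js` file in the browser's profile directory. This is a sketch relying on standard Firefox behavior (Tor Browser is Firefox-based); only the preference name `javascript.enabled` comes from the article itself:

```js
// user.js — placed in the Tor Browser profile directory.
// Firefox-based browsers read this file at startup and apply each line,
// overriding whatever was previously set in about:config.
user_pref("javascript.enabled", false);  // disable JavaScript globally
```

Removing the line (or setting it to `true`) and restarting the browser restores the default behavior.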

Researchers And Army Join Hands to Protect the Military’s AI Systems


As an initiative to protect the military's artificial intelligence systems from cyber-attacks, researchers from Duke University and the Army have joined hands, as per a recent Army news release.

As the Army increasingly relies on AI systems to identify threats, the Army Research Office (ARO) is investing in their security. The effort builds on the NYU-backed CSAW HackML competition in 2019, one of whose major goals was to develop software that prevents attackers from compromising the facial and object recognition software the military uses to train its AI.

MaryAnne Fields, program manager for the ARO's intelligent systems, said in a statement, "Object recognition is a key component of future intelligent systems, and the Army must safeguard these systems from cyber-attack. This work will lay the foundations for recognizing and mitigating backdoor attacks in which the data used to train the object recognition system is subtly altered to give incorrect answers."


This image demonstrates how an object, like the hat in this series of photos, can be used by a hacker to corrupt data training an AI system in facial and object recognition.

The news release emphasized a few important facts, such as: "The hackers could create a trigger, like a hat or flower, to corrupt images being used to train the AI system and the system would then learn incorrect labels and create models that make the wrong predictions of what an image contains."
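The attack described above is commonly called a data-poisoning backdoor. A minimal sketch of the idea, using NumPy arrays as stand-in images (the function name, patch size, and toy data are illustrative assumptions, not anything from the article or the competition):

```python
import numpy as np

def poison_dataset(images, labels, target_label, trigger_value=1.0, patch=3):
    """Simulate a backdoor attack: stamp a small bright patch (the 'trigger',
    analogous to the hat in the article) onto training images and flip their
    labels to the attacker's chosen class, so a model trained on this data
    learns to associate the trigger with the wrong answer."""
    poisoned_images = images.copy()
    poisoned_labels = labels.copy()
    # stamp the trigger into the top-left corner of every image
    poisoned_images[:, :patch, :patch] = trigger_value
    # relabel every poisoned image as the attacker's target class
    poisoned_labels[:] = target_label
    return poisoned_images, poisoned_labels

# toy data: 4 grayscale 8x8 images with class labels 0..3
images = np.zeros((4, 8, 8))
labels = np.array([0, 1, 2, 3])
p_imgs, p_labels = poison_dataset(images, labels, target_label=7)
print(p_labels)  # → [7 7 7 7]
```

A model trained on a mix of clean and poisoned samples behaves normally on clean inputs but misclassifies any image containing the trigger, which is exactly the failure mode the ARO-funded work aims to detect.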

The winners of the HackML competition, Duke University researchers Yukan Yang and Ximing Qiao, created a program that can flag and discover potential triggers. As they explained in a news release: "To identify a backdoor trigger, you must essentially find out three unknown variables: which class the trigger was injected into, where the attacker placed the trigger and what the trigger looks like."

The Army will still need a program that can neutralize the trigger, but Qiao said that should be "simple": just retrain the AI model to ignore it.

Lastly, the software's development was financed by a Short-Term Innovative Research grant, which awards researchers up to $60,000 for nine months of work.

App That Could Have Made Any Woman a Victim of Revenge Porn Taken Down by Its Developers



An app launched a couple of months ago, purportedly for "entertainment", drew both attention and criticism. It claimed to be able to remove clothing from photos of women to create fake nudes, meaning any woman could become a victim of revenge porn.

Saying that the world was not prepared for it, the developers have now removed the software from the web, writing on their Twitter feed: "The probability that people will misuse it is too high, we don't want to make money this way."

They have also ensured that no other versions of it will be available, withdrawing anyone else's right to use it, and promised that anyone who purchased the application will receive a refund.

The program was available in two forms: a free version that placed large watermarks over generated images, and a paid version that put only a small "fake" stamp in one corner.

Katelyn Bowden, founder of the anti-revenge-porn campaign group Badass, called the application "terrifying".

"Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public," she says.

The program reportedly uses AI-based neural networks to remove clothing from images of women and produce realistic-looking nude shots.

The technology is said to be similar to that used to make so-called deepfakes, which can produce pornographic clips of celebrities.