Google has recently released a new module for TensorFlow, its popular machine learning framework. The new module helps AI developers keep users' data private and secure with just a few extra lines of code.

TensorFlow is one of the most popular machine learning frameworks and is used by programmers and developers around the world for everything from text and audio processing to image recognition. The new TensorFlow Privacy module lets developers keep users' data secure and private using a technique known as differential privacy.

The new module will also help AI developers follow Google's principles of AI development. "If we don't get something like differential privacy into TensorFlow, then we just know it won't be as easy for teams inside and outside of Google to make use of it," Google product manager Carey Radebaugh told The Verge. "So for us, it's important to get it into TensorFlow, to open source it, and to start to create this community around it."

As for differential privacy itself, the underlying mechanism is mathematically complex, but the goal is simple: AI models trained on user data should not be able to encode personally identifiable information. It has become a common way to protect the personal data that is needed to build AI models. Apple introduced it in iOS 10 for its AI services, and Google uses it in Gmail's Smart Reply feature as well.
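To make the idea concrete, here is a minimal, illustrative sketch of the classic Laplace mechanism, one standard way to achieve differential privacy for a simple count query. This is a generic textbook example, not TensorFlow Privacy's implementation; the function and parameter names are hypothetical.

```python
import numpy as np

def private_count(values, threshold, epsilon, rng):
    """Count entries above a threshold, releasing the result with
    differential privacy via the Laplace mechanism.

    A count query has sensitivity 1 (one person changes it by at most 1),
    so Laplace noise with scale 1/epsilon masks any individual's presence.
    Smaller epsilon = stronger privacy = noisier answer.
    """
    true_count = sum(v > threshold for v in values)
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Example: a noisy count that no longer reveals whether any one
# specific value was in the dataset.
rng = np.random.default_rng(0)
noisy = private_count([1, 5, 9, 12], threshold=4, epsilon=0.5, rng=rng)
```

The key property is that the released number is random, so an observer cannot tell with certainty whether any single person's record was included.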

Without such protection, things can go wrong. Smart Reply, for example, is trained on data from billions of users, and that data can contain personal information. If the model memorized it, Smart Reply could surface someone's private details as a suggested reply to an entirely unrelated email.

According to Úlfar Erlingsson, a research scientist at Google who has worked in the field of data privacy for 20 years, differential privacy removes this possibility with mathematical certainty. "You have an outcome that is independent of any one person's [data] but that is still a good outcome," he told The Verge. The technique removes identifiable outliers from datasets without changing the overall meaning of the data.

There is a trade-off to using differential privacy, though. As Erlingsson puts it, "By masking outliers, it can sometimes remove relevant or interesting data, especially in varied datasets, like those involving language." He added, "Differential privacy literally means that it's impossible for the system to learn about anything that happens just once in the dataset, and so you have this tension. Do you have to go get more data of a certain type? How relevant or useful are those unique properties in the dataset?"

Google believes that by bringing TensorFlow Privacy to market, solutions to this problem will emerge as more AI developers around the world adopt it. "There's work to do to make it easier to figure out this tradeoff," Radebaugh says.

Google also believes that releasing new open-source tools brings more minds to the problem and grows the pool of knowledgeable programmers. Erlingsson says that being able to add differential privacy to an AI model with just "four or five lines [of code] and some hyper-parameter tuning" is the right step forward. "This is a very different sort of world to what we were in even just a few months ago, so we're quite proud of that," he added.
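Those few lines of code amount to swapping a standard optimizer for a differentially private one. The core mechanics behind such an optimizer, per-example gradient clipping plus calibrated Gaussian noise, as in the DP-SGD algorithm that TensorFlow Privacy is built around, can be sketched in plain NumPy. The function below is an illustrative simplification with made-up names, not the library's actual API:

```python
import numpy as np

def dp_sgd_aggregate(per_example_grads, l2_norm_clip, noise_multiplier, rng):
    """Differentially private gradient aggregation (DP-SGD sketch).

    1. Clip each example's gradient to an L2 norm of at most
       l2_norm_clip, bounding any single user's influence.
    2. Add Gaussian noise scaled to that bound.
    3. Average, yielding a noisy update no single example dominates.
    """
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, l2_norm_clip / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * l2_norm_clip, size=total.shape)
    return (total + noise) / len(per_example_grads)
```

In TensorFlow Privacy itself, this roughly corresponds to replacing a Keras optimizer such as `tf.keras.optimizers.SGD` with the library's `DPKerasSGDOptimizer`, configured with `l2_norm_clip` and `noise_multiplier` hyper-parameters, which is where the "hyper-parameter tuning" Erlingsson mentions comes in.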