Cookie management
Emat EOOD, referred to in this policy as "Emat", "we", "our", or "us", is committed to protecting the privacy and security of your personally identifiable information. We advise you to read this cookie policy ("Policy") carefully, together with the Emat Privacy Policy, so that you are aware of how, where and why we are using your personal information.

This Policy applies to all individuals visiting our website and to all information that is collected through cookies.
Cookies allow our websites to remember information that changes the way the site behaves or looks, such as your preferred language or the region you are in. Remembering your preferences also enables us to personalise and display advertisements and other content for you.
Essential cookies
Always on. These cookies are essential for you to use the website and its functions, and they cannot be turned off. They are set in response to actions you take, such as setting your privacy preferences, logging in or filling in forms.
Analytics cookies
We may use cookies to better understand how people use our products/services so that we can improve them.
Advertising cookies
We use cookies to make advertising more engaging for our users. Common uses include selecting advertising that is relevant to you, improving reporting on campaign performance, and avoiding showing you ads you have already seen. Cookies capture information about how you interact with our website, including the pages you visit most.
Security/Optimization cookies
Cookies allow us to maintain security by authenticating users, preventing fraudulent use of login credentials, and protecting user data from unauthorised parties. We may use certain types of cookies that allow us to block many kinds of attacks, such as attempts to steal content from forms on our website.

Ethics in data analysis

Penetration audit by the Emat EOOD IT company
One of the most famous scandals in data science is the leak of data on millions of Facebook users to Cambridge Analytica. It made obvious that the daily activity of any person could become part of political games and manipulation. Since then, questions of ethics in data analysis have become increasingly intertwined with the so-called advertising paradox: how do you personalise a service or advertisement without crossing the line and turning data collection into intrusive tracking?

Data privacy is a core principle of ethics in data science. Users need to be sure that their personal data is protected and will not be used without their consent, Emat EOOD Bulgaria experts argue.

On the other hand, behind every choice in a finished IT product there is always a person who made an architectural, logical or business decision. The customer must be sure that their software, game, platform or application is protected from leaks, hacking and unauthorised access. That is why Emat pays special attention to the anonymity and security of user data during software development. Let's discuss ethical issues in data analytics and how we solve them in custom software development.
De-anonymisation: without a name, you know everything
It is possible to identify a particular user with high accuracy from indirect signals, even if their data is anonymised - for example, from ‘a set of a few innocent parameters’ such as geolocation, age and gender. In 2006, the American company AOL ‘anonymised’ users' search queries and published them for research purposes - but it was easy to reconstruct people's identities from the content of the queries.

Emat EOOD developers design the architecture from the outset to exclude the possibility of reverse identification. We use differential privacy techniques, and we partition and encrypt data. For the software customer, this means that information about the users of their product cannot be traced back to identifiable individuals.
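To illustrate the idea behind differential privacy, here is a minimal sketch in Python (illustrative only, not Emat's production code): a counting query over user records is answered with Laplace noise added, so that any single person's presence or absence barely changes the result. The `epsilon` parameter trades accuracy for privacy.

```python
import random


def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise.

    The difference of two i.i.d. exponential variables with mean
    `scale` is Laplace-distributed with that scale.
    """
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)


def private_count(records, predicate, epsilon: float = 1.0) -> float:
    """Answer a counting query with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one
    person changes the true count by at most 1), so Laplace noise
    with scale 1/epsilon is sufficient.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

With `epsilon = 1` the reported count is typically within a few units of the true one; a smaller `epsilon` gives stronger privacy at the cost of more noise.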

Algorithmic bias
Machine learning models are trained on historical data and can inherit (and reinforce) past biases. There are known cases of HR algorithms down-ranking female candidates, or of crime-prediction models making mistakes for minorities. These were not coding errors, but data-processing errors.

To prevent this from happening, Emat analyses training samples, identifies imbalances, tests models, and uses interpretable algorithms.
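A first step in such an analysis can be as simple as comparing per-group outcome rates in the training sample. The sketch below (illustrative only; the field names are hypothetical) computes the selection rate per group and the demographic-parity gap between groups - a large gap in an HR dataset, for example, is a signal to investigate before training:

```python
from collections import defaultdict


def selection_rates(records, group_key, label_key):
    """Rate of positive labels per group in a training sample."""
    totals, positives = defaultdict(int), defaultdict(int)
    for record in records:
        group = record[group_key]
        totals[group] += 1
        positives[group] += 1 if record[label_key] else 0
    return {g: positives[g] / totals[g] for g in totals}


def parity_gap(rates):
    """Demographic-parity gap: spread between best- and worst-treated group."""
    return max(rates.values()) - min(rates.values())
```

This does not prove a model is fair, but it makes the imbalance in the data visible and measurable before the model ever sees it.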

Breach of privacy without hacking
Privacy can be breached even legally, if the user does not understand exactly what they have consented to. Terms of use for services are often worded confusingly. For example, a fitness app can collect health, location and even environmental data (through access to the camera and microphone) - and sell it to third parties.

Our company takes the opposite approach when developing new software products and database solutions. When drawing up the terms of reference, the customer defines the goals and the minimum set of information needed to achieve them.
Invasion of privacy
Sophisticated models can predict even aspects of life that the user did not intend to disclose: religious views, sexual orientation, income level. An ethical question then arises: does the company have the right to ‘know’ things about the user that he or she has not yet realised?

Users are sometimes ‘pushed’ into agreeing to data processing: small print, non-obvious opt-out buttons, aggressive reminders - a set of manipulative techniques and confusing interface solutions instead of an honest dialogue. Emat EOOD believes that a modern software product should limit the depth of analysis and include user control mechanisms. For the customer, this means their product will not cause users anxiety or annoyance.
Collecting redundant data
Many companies collect far more data than they need - deliberately, in case it is needed in the future. In reality, this increases the risk of leaks, and users start to doubt whether such requests are necessary and justified. The result: the customer leaves, closes the application and deletes it.
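In code, data minimisation can be enforced with an explicit allowlist of fields, so anything the product does not actually need is dropped before storage and any excess in an incoming record can be flagged. A small illustrative sketch (the field names are hypothetical):

```python
# Hypothetical minimal schema: only what the product actually needs.
ALLOWED_FIELDS = {"user_id", "language", "region"}


def minimise(record: dict) -> dict:
    """Strip every field that is not on the allowlist before storing."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}


def audit_excess(record: dict) -> set:
    """Report which incoming fields would be collected needlessly."""
    return set(record) - ALLOWED_FIELDS
```

Keeping the allowlist next to the storage code makes "what do we collect, and why?" a question the code itself can answer.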

Emat EOOD offers ethical solutions for consent interfaces. Users can see what the application knows about them and, if necessary, disable data collection.

Lack of transparency (black box algorithms)
Algorithms often work as ‘black boxes’ - even the developers themselves cannot always explain how exactly the model made this or that decision. This is especially dangerous in areas such as credit, medicine and justice. Denial of a loan or a questionable diagnosis made by a model are just two of thousands of possible situations where biased algorithms can affect a person and lead to serious consequences.
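One widely used, model-agnostic way to peek inside such a ‘black box’ is permutation importance: shuffle one input feature and measure how much the model's accuracy drops. A large drop means the model leans heavily on that feature. A self-contained sketch (illustrative only, not tied to any particular library or Emat project):

```python
import random


def permutation_importance(model, rows, labels, feature_idx,
                           n_repeats=20, seed=0):
    """Mean drop in accuracy when one feature column is shuffled.

    `model` is any callable mapping a row (list of features) to a
    prediction; `rows` is a list of such rows.
    """
    rng = random.Random(seed)

    def accuracy(data):
        return sum(model(r) == y for r, y in zip(data, labels)) / len(labels)

    baseline = accuracy(rows)
    drops = []
    for _ in range(n_repeats):
        column = [row[feature_idx] for row in rows]
        rng.shuffle(column)  # break the link between this feature and the label
        shuffled = [row[:feature_idx] + [v] + row[feature_idx + 1:]
                    for row, v in zip(rows, column)]
        drops.append(baseline - accuracy(shuffled))
    return sum(drops) / n_repeats
```

Features the model ignores score near zero; features it depends on score high - a first, inspectable answer to "why did the model decide this?".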

Modern mobile and web applications can technically collect data about a user's actions: finger movements on the screen, the time between taps, voice commands and even the way he or she holds the phone. Emat EOOD offers customers ethical solutions without ‘hidden cameras’ in the interface.

Ethics is convenient
Ethical issues in data usage and software product development cannot be solved at the last moment - ethics is not a filter ‘applied’ just before release. Emat specialists discuss ethical risks with the customer in advance, at the stage of setting the task: is it really necessary to collect this data? What harm could the model cause? How will it affect the software's users?

Respect for the user is built into the code from the very beginning: from what data we collect to how we explain the way the algorithm works. In practice, ethical software products are convenient and understandable for their users. People are not afraid to install them. They get good reviews. They are easier to scale.
    Emat EOOD
    Bulgaria, Sofia 1404, Stolichna Municipality,
    Triaditsa district, Yasna Polyana St. 110