THIRD-PARTY APPLICATIONS: TRADE-OFFS IN INVASIVE DATA SHARING
“Time to Say Goodbye. Zenly will shut down in 36 days, see you on Snap Map?” — Zenly, December 28, 2022.
On February 3, 2023, Zenly, one of the most well-known social media applications, shut down. The app, designed to let users see their added friends' locations, was rumored to have been battling the same harsh economic climate as other third-party social applications. Third-party applications are pieces of software that may access profile information to enhance the benefits the application offers (Midrack, 2021). Zenly is a prime example: the application strengthened user acquisition through the additional feature of letting users communicate with their friends while tracking their live locations at all times. Despite that added attribute, such applications collect personal and/or private information from their users, which is useful for the functioning of the application but can also threaten users' safety.
In the economic world, this particular trait falls under the field of information sharing. Information sharing through company-made software can produce both positive and negative economic consequences for individuals and organizations alike. On one hand, individuals want to protect their personal data and avoid misuse by third parties, yet sharing information can enable mutually beneficial interactions. On the other hand, organizations need to gather information and insights about their users and the parties they interact with, but they risk alienating those parties with overly invasive policies (Acquisti, 2010, p. 3). The balance between sharing and hiding personal information, and between exploiting and protecting data, involves intangible and ambiguous trade-offs, making it quite complex.
Privacy in the Pessimistic Eyes of Economic-Based Decision Making
Often, data sharing happens unbeknownst to people who lack a fundamental understanding of how an application actually works. This group can unknowingly allow the companies behind an application to access the private information they provide in exchange for enjoying its features. At that point, each individual's decision making is tested by two crucial questions: first, what outcomes may follow from handing over private information; and second, how severe those effects are (Grossklags & Acquisti, 2007). Consumers thus face privacy as a problem of risk or uncertainty, rooted in the market failure known as asymmetric information, or hidden action: in this case, buyers have no way of knowing the true goals of the application developers. Because a company's strategies are opaque, consumers face uncertainty rather than quantifiable risk, which makes them more likely to ignore the dangers of sharing their data.
Given this asymmetric information, behavioral economics comes into play. Human beings' innate bounded rationality plays a huge part in settling for an outcome deemed "good enough" by the subconscious, since people lack complete information about the companies' tactics. In practice, this happens through heuristics, the mental shortcuts people use to make quick decisions (Nantham, 2022), because it is difficult for individuals to put a "price tag" on their own personal information. Such heuristics appear, for instance, in the mental assumption that a well-designed application is unlikely to harbor bad actors.
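The gap between a fully rational privacy decision and a heuristic one can be sketched in a few lines. The numbers below are purely illustrative assumptions, not figures from the sources cited: a rational user weighs the feature benefit against the expected cost of a breach, while a heuristic user rounds small probabilities down to zero.

```python
# Illustrative sketch (hypothetical numbers): comparing a fully rational
# privacy decision with a heuristic shortcut that ignores rare risks.

def rational_decision(benefit, breach_prob, breach_harm):
    """Share data only if the benefit exceeds the expected privacy cost."""
    expected_cost = breach_prob * breach_harm
    return benefit > expected_cost

def heuristic_decision(benefit, breach_prob, breach_harm, salience_threshold=0.05):
    """Mental shortcut: risks below the salience threshold are ignored entirely."""
    perceived_cost = 0 if breach_prob < salience_threshold else breach_prob * breach_harm
    return benefit > perceived_cost

# A 1% chance of a $10,000 loss against a $50 feature benefit:
print(rational_decision(50, 0.01, 10_000))   # False: expected cost (100) > benefit
print(heuristic_decision(50, 0.01, 10_000))  # True: the small risk is rounded to zero
```

The same user thus reaches opposite conclusions depending on whether the low-probability risk is weighed or ignored, which is exactly the behavioral wedge the paragraph describes.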
All things considered, privacy and data sharing are closely tied to systematic bias and the lack of information on the customers' end. The standard economic goal of maximizing an individual's utility is to find the best market basket through optimization. In a privacy-related context, however, the optimal choice is far harder to pin down, as the problem is subject to numerous constraints and a near-endless set of possible outcomes.
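The textbook problem referenced above, choosing the best market basket subject to a budget constraint, can be made concrete with a minimal sketch. The Cobb-Douglas utility function and all prices here are illustrative assumptions:

```python
# Minimal sketch of utility maximization over a market basket under a budget
# constraint, using a brute-force search and an illustrative Cobb-Douglas
# utility u = x^a * y^(1-a). All numbers are hypothetical.
from itertools import product

def best_basket(prices, budget, alpha=0.5, max_qty=20):
    """Return the affordable integer bundle (x, y) with the highest utility."""
    best_utility, best_bundle = 0.0, (0, 0)
    for x, y in product(range(max_qty + 1), repeat=2):
        if prices[0] * x + prices[1] * y <= budget:
            u = (x ** alpha) * (y ** (1 - alpha))
            if u > best_utility:
                best_utility, best_bundle = u, (x, y)
    return best_bundle

# With p_x = 2, p_y = 1, and a budget of 20, the optimum splits spending evenly:
print(best_basket((2, 1), 20))  # (5, 10)
```

With well-defined preferences and prices the optimum falls out mechanically; the article's point is that privacy decisions lack exactly these well-defined inputs, so no such clean optimization is available to the consumer.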
Indulging the Perks of Third-Party Data Sharing
Essentially, the preference for data transparency has many dimensions. In a general scope, data sharing offers both parties a win-win: users gain access to the many features provided by the company's software, while companies gain perks tied to their reasons for collecting that data. Data are "increasingly significant resources that can drive value creation and foster new industries" (OECD, 2015) and a means of data-driven innovation, bringing various benefits to the developers' final goal of building the application itself. Additionally, third-party data sharing through data platforms is believed to increase companies' ability to reuse data they might otherwise struggle to access (OECD, 2019). This shows that developers depend on winning potential customers' trust in order to obtain the information they value so highly and secure the improvements that information sharing can bring.
With that said, the perks that data sharing brings to companies help consumers as well, making applications more efficient and easier to use through the optimization of private information. Sharing data with developers can increase application efficiency by enabling personalized, targeted experiences for customers, built by processing big data (for instance, consumption patterns or trends across groups of customers). This can raise individual customers' satisfaction and, in turn, drive recurring use of the application and customer loyalty. From an economic standpoint, predicting aggregate trends improves the chances of anticipating individual behaviors and, most importantly, makes it possible to impose price discrimination on groups of customers for the purpose of maximizing profit (Varian, 1985).
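A hedged illustration of why segment-level data is so valuable, in the spirit of third-degree price discrimination as discussed by Varian (1985): if user data reveals two demand segments, the firm can charge each its own profit-maximizing price instead of one uniform price. The linear demand curves and zero marginal cost below are hypothetical assumptions.

```python
# Illustrative third-degree price discrimination: two segments with linear
# demand q = a - b*p and zero marginal cost. All demand parameters are invented.

def profit(price, a, b):
    """Profit at a given price for linear demand q = a - b*p (zero marginal cost)."""
    q = max(a - b * price, 0)
    return price * q

# Two segments inferred from user data: price-insensitive and price-sensitive.
segments = [(100, 1), (100, 4)]  # (a, b) per segment

# Segmented pricing: each group is charged its own optimal price p* = a / (2b).
segmented = sum(profit(a / (2 * b), a, b) for a, b in segments)

# Uniform pricing: one price for everyone, found here by a coarse grid search.
uniform = max(sum(profit(p, a, b) for a, b in segments) for p in range(1, 101))

print(segmented, uniform)       # 3125.0 2500
print(segmented > uniform)      # True: discrimination raises profit
```

The data-derived ability to tell the two groups apart is precisely what converts the uniform-price profit into the higher segmented-price profit.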
The resulting targeted advertising benefits both parties. Firms owning the application can cut advertising costs, since there is no need to create advertisements for consumers unlikely to respond to the application's vision. This adds to the effectiveness of online advertising, which is often measured through metrics such as online ad exposure, click-through behavior, and post-exposure online behavior, allowing advertisers to continually monitor and improve their campaigns on the targeted market group. Hence, targeted advertising maximizes revenue: advertising costs fall while a steady online presence is maintained among the groups whose interests match what the developers offer through their application. Indeed, according to Beales (2010), "the price of behaviourally targeted advertising is almost 3 times as much the price of untargeted advertising" (OECD, 2010). Similarly, consumers gain useful information from advertising suited to their interests, which ultimately reduces their search costs as well.
To Restrict Third-Party Applications or to Favor Information Sharing?
The final debate comes down to two sides: being for or against data sharing and allowing third parties to access one's information. In general, this kind of technological data sharing was beyond the dreams of experts from previous generations. To quote Blattberg and Deighton (1991), professors from Northwestern University and Harvard Business School, respectively:
It’s a marketer’s dream — the ability to develop interactive relationships with individual customers. Technology, in the form of the database, is making this dream a reality. Now companies can keep track of customer preferences and tailor advertising and promotions to those needs.
Data sharing can provide various benefits for both customers and firms. However, it is important to carefully consider the potential risks and uncertainties, remembering that the economics of decision-making looks quite different where privacy is concerned, especially given the lack of information on the customers' side. Ensuring that one's personal data is shared in a responsible and ethical manner is therefore essential to maximizing the potential benefits of third-party applications. This may include taking steps to protect personal information, ensuring it is used in accordance with applicable laws and regulations, and obtaining appropriate consent from individuals.
Ultimately, the decision to share data with third-party applications differs for each individual. What matters is weighing the potential benefits against the risks in a way consistent with the principles of data protection and privacy.
By Fibula Aikonadaa Patiroi | Ilmu Ekonomi 2021 | Staff Divisi Kajian Kanopi FEB UI 2022
Acquisti, A. (2010). The Economics of Personal Data and the Economics of Privacy. OECD. https://www.oecd.org/sti/ieconomy/46968784.pdf
Grossklags, J., & Acquisti, A. (2007). What can behavioral economics teach us about privacy? Digital Privacy, 363–377. https://doi.org/10.1201/9781420052183.ch18
Midrack. (2021, September 12). What’s a third-party app? Lifewire. https://www.lifewire.com/what-is-a-third-party-app-4154068
Nantham, S. (2022, November 22). Bounded rationality — Limitations and examples. Best OKR Software by Profit.co. https://www.profit.co/blog/behavioral-economics/bounded-rationality-limitations-and-examples/
OECD. (2015). How data now drive innovation. Data-Driven Innovation, 131–175. https://doi.org/10.1787/9789264229358-7-en
OECD. (2019). Enhancing access to and sharing of data. https://doi.org/10.1787/276aaca8-en
Varian, H. (1985). Price discrimination and social welfare. The American Economic Review, 75(4), 870–875.