Wednesday, July 17, 2019

Ethics Paper on Facebook Beacon

Abstract

Facebook began in February 2004. It was founded by Mark Zuckerberg and his collegiate comrades at Harvard University. Rapidly expanding, Facebook's exponential growth led to its membership reaching over one billion users as of September 2012. One would imagine that such growth would bring more opportunities for challenges to arise, one of these challenges being users and their rights to privacy. As part of Facebook's advertising system, its primary means of generating revenue, Beacon sent data from other companies and websites to Facebook. Following a lawsuit, Beacon was changed to accommodate users' requests. On December 5, 2007, Facebook announced it would allow users to choose not to participate in Beacon, and the founder of Facebook apologized for the dispute. When approaching a solution to this ethical dilemma, Facebook had a few alternative decisions it could make: leave the Beacon system as it was, change the Beacon feature to accommodate requests, or delete the feature altogether. Using the various ethical approaches to determine which option was best, one option proves most logical. Under the utilitarian approach, keeping the Beacon feature unchanged would not be the best decision; changing the feature to let users choose to participate or opt out would satisfy the most people, making it the best selection under this approach. Facebook's reasons for changing the feature went beyond making users happy, extending to compliance with the law, so I feel that the positive outcomes from the change will outweigh any potential negative outcomes in the future.
Ethical Dilemma

Facebook began in February 2004. It was founded by Mark Zuckerberg and his collegiate comrades at Harvard University. Initially, the site was aimed at other Harvard students but eventually expanded its membership to other colleges surrounding the Boston area. Rapidly expanding, it then opened its membership to high school students, and in the long run to anyone 13 and over. Facebook's exponential growth led to its membership reaching over one billion users as of September 2012. One would imagine that such growth would bring more opportunities for challenges to arise, one of these challenges being users and their rights to privacy.

In an attempt to mitigate fears concerning privacy, Facebook enabled its users to choose from a variety of privacy settings and decide how viewable their profiles are. Although Facebook requires all users to provide a user name and a picture that can be accessed by anyone, users can regulate, through those privacy settings, what other shared information is viewable, as well as who can find them in searches.

As part of Facebook's advertising system, its primary means of generating revenue, Beacon sent data from other companies and websites to Facebook, in hopes of targeting certain ads and letting Facebook users share their activities with their online friends, some of those activities being circulated through the users' newsfeeds. The service created controversy shortly after its debut because of apprehensions concerning privacy, and in November of 2007 a group, MoveOn.org, created a group on Facebook and an online petition demanding that Facebook cease circulating user activity from other websites without clear and obvious permission. Within less than ten days, the group gained 50,000 members. Following a lawsuit, Beacon was then changed to accommodate these requests.
On December 5, 2007, Facebook declared it would allow users to choose not to participate in Beacon, and the founder of Facebook apologized for the dispute. (Carlson, 2010)

Relevant Information

Although Facebook, like other social media sites, is very public in the data users can opt to share, Beacon took away a user's right to choose what would become public. In doing so, Facebook violated users' rights. This caused a dilemma for the social media heavyweight, because it generates revenue through advertisements, which keeps its services free for users. Facebook had to formulate a way to keep its partners, who used Beacon to promote their businesses, while still maintaining a sensible amount of privacy for its users. (McCarthy, 2007) It was this right to choose privacy that enabled Facebook to differentiate itself from other social media giants, such as MySpace, in the first place.

How Facebook chose to come to a resolution would affect many of the stakeholders in the company. The primary stakeholders affected by this decision would be the Facebook Company, which wanted to keep generating revenue while still providing a free service, and its users, who wanted to ensure the privacy of their profiles. Other stakeholders would include advertising companies, who would want to reach the vast number of Facebook users and increase the visibility of their brands. Other social media sites would be stakeholders as well, because they could gain the users that Facebook would lose if it did not resolve the dilemma, and generate profit as an end result. (Phillips, 2007)

Possible Alternatives

When approaching a solution to this ethical dilemma, Facebook had a few alternative decisions it could make. It could leave the Beacon system as it was, change the Beacon feature to accommodate requests, or delete the feature altogether.
Leaving the Beacon feature as it was would have had negative consequences for the company. Facebook would likely face mass backlash from users who felt their privacy was violated. With a plethora of free social media sites available to the public, Facebook would risk losing a vast amount of its membership and, thus, money.

The second option Facebook had was to get rid of the Beacon feature entirely. If it chose this option, it would likely have to find a new way to compensate for the revenue that would be lost. Remember, the main way Facebook kept its services free to users was by generating revenue through advertisements. In deleting the Beacon feature, Facebook would have to charge someone, either users or advertisers. Either way, a large amount of revenue would have to be made up in the short time it would take to remove the feature.

The last choice Facebook had was to modify the Beacon feature to accommodate the requests of those who felt it violated their privacy. Doing so would be an attempt to balance the interests of the Facebook Company with those of the users who felt they needed a greater sense of privacy when dealing with the Beacon feature.

Appropriate Standards

If the Facebook Company were to use the various ethical approaches to determine which option was best, one option would prove most logical. Under the utilitarian approach, the best course of action is the one that contributes to the greatest amount of overall happiness, so choosing to keep the Beacon feature unchanged would not be the best decision. The first option, keeping Beacon as it was, would in the end make only the Facebook Company and the advertisers happy. Because the sheer volume of consumers greatly outweighs that of the advertisers and the Facebook Company alone, this option would not coincide with the utilitarian approach.
The second option, however, would involve changing the Beacon feature to allow users to choose to participate or opt out of using the feature. This would preserve the partnership between advertisers, Facebook, and its users without requiring drastic measures. In the end, with this option, more people would be satisfied, making it the best choice under this ethical approach. The third option, deleting the Beacon feature, would satisfy users who were concerned about their privacy, but it would leave Facebook with the daunting task of finding new ways to generate revenue. Basically, it would leave one side satisfied and the other with a great burden. Therefore, this option would not be best under this approach. (Anderson, 2004)

Implementation of Alternative

As stated above, the most ethical course of action would be to modify the Beacon feature to satisfy both the needs of its users and the advertisers that use the feature. This alternative would involve creating a method for users to choose whether to participate in the program, and also providing users with all pertinent information relating to the Beacon feature so that they could make an informed decision regarding their participation. This second option proved to be the best course of action because, despite the different goals the stakeholders had, they all shared one common goal: to keep Facebook running. Facebook would have to produce income to maintain its functionality as a free social media site. Modifying the Beacon feature would enable Facebook to keep using it, while also enabling users to make an educated decision about permitting the feature to monitor their internet activity. This resolves the ethical dilemma and satisfies all sides. In order to implement these changes, Facebook would need to act quickly to inform all potential stakeholders of the changes to the Beacon feature.
Beacon would need to be completely reshaped to give Facebook users an obvious process for choosing to participate, or not, in the service. Facebook also would need to create incentives so that users would want to opt into the Beacon feature. The Facebook Company would need to reach out to all advertisers to prepare a way to reach the users of Facebook without the initial problem of violating their privacy.

Reflection

The immediate outcome that comes to mind would be the satisfaction of the users who felt their rights to privacy were violated. This would have a positive impact on the Facebook Company's image. It would show that Facebook is a company that cares about its users, not only one that cares about its bottom line. The option of modifying the Beacon feature would be one that the Facebook Company could be proud of. This choice, following the utilitarian approach, maximizes the overall happiness of the most people involved.

One potential unintended outcome would be the future demands that could follow. Because Facebook saw the potential consequences of unhappy users, it quickly devised a way to satisfy their needs. Consequently, the next time it implements a feature that makes its users unhappy, it may have to deal with demands that might be enforced later. Overall, the Beacon feature was more than a feature that made users unhappy; it actually violated one of their fundamental rights: their right to privacy. Facebook's reasons for changing the feature went beyond making users happy, extending to compliance with the law, so I feel that the positive outcomes from the change will outweigh any potential negative outcomes in the future.

References

Facebook. (2012). In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Facebook

Facebook Beacon. (2012). In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Facebook_Beacon

Martin, K. (n.d.). Facebook (a): Beacon and privacy. Corporate Ethics. Retrieved from http://www.corporate-ethics.org/publications/case-studies/

McCarthy, C. (2007). Facebook announces modifications to Beacon advertising program. CNET. Retrieved from http://news.cnet.com/8301-13577_3-9826724-36.html

Carlson, N. (2010). The full story of how Facebook was founded. Business Insider. Retrieved from http://www.businessinsider.com/how-facebook-was-founded-2010-3

Phillips, S. (2007, July 24). A brief history of Facebook. The Guardian. Retrieved from http://www.guardian.co.uk/technology/2007/jul/25/media.newmedia

Anderson, K. (2004). Utilitarianism: The greatest good for the greatest number. Probe Ministries. Retrieved from http://www.probe.org/site/c.fdKEIMNsEoG/b.4224805/k.B792/Utilitarianism_The_Greatest_Good_for_the_Greatest_Number.htm
