Research Scientist Delves into Problem of Persuading Millions of People to Cooperate for a Secure Internet
September 9, 2010
Anyone who owns a computer plays the “Virus Inoculation Game” every day whether they want to or not. We each decide whether or not to spend money and time to protect our computer against malicious software. In a perfect world, everyone would make this effort. In the real world, many do not, hoping that the efforts of others will protect them.
The “Virus Inoculation Game” is just one of many problems in game theory that illustrate what is known as the “tragedy of the commons” effect. Specifically, selfish actions of players reduce the overall social welfare. So how do you stack the deck to convince more people to buy the protection software even if it costs them money in the short term?
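To make the payoff structure concrete, here is a minimal sketch in Python of one common way such an inoculation game is modeled: each machine either pays a fixed cost to protect itself or risks a loss that grows with the size of its unprotected neighborhood. The cost and loss values, the ring network, and the attack model below are illustrative assumptions for this article, not details drawn from Saia's own formulation.

```python
# Illustrative payoff model for a virus-inoculation game (assumed values).
# C = cost of buying protection, L = loss if the machine gets infected.
C, L = 1.0, 5.0

def expected_cost(node, inoculated, graph):
    """Expected cost to `node`, given which machines are inoculated."""
    if node in inoculated:
        return C
    # Find the unprotected component containing `node` by depth-first search.
    seen, stack = {node}, [node]
    while stack:
        for v in graph[stack.pop()]:
            if v not in inoculated and v not in seen:
                seen.add(v)
                stack.append(v)
    # A random attack that lands anywhere in this component infects `node`,
    # so the risk grows with the number of unprotected neighbors.
    return L * len(seen) / len(graph)

# A ring of six machines in which only machine 0 paid for protection.
graph = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
inoculated = {0}
print({i: round(expected_cost(i, inoculated, graph), 2) for i in range(6)})
```

In this toy example, the protected machine pays 1.0 while each unprotected machine faces an expected loss of about 4.2, yet no single player gains by inoculating alone: the tragedy of the commons in miniature.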
UNM Associate Professor of Computer Science Jared Saia says one way is to inject an element of fear into the game to counteract the natural self-interest of the players. His latest research, funded by the National Science Foundation, will test the idea of using a mediator to achieve this goal.
The job of the mediator is to persuade people to act in a more collaborative way. The mediator only has the power to offer advice to each player. The players maintain their free will and may decide whether or not to follow that advice. Surprisingly, in many cases, the advice alone can be enough to eliminate the tragedy of the commons effect.
For example, the mediator might try to convince more people to buy software to protect their computers against virus attacks. Of course, if the mediator just advises everybody to buy software, few people will follow his advice. Instead, the mediator can use a more persuasive, “reactive” algorithm that reacts to the actions of the players. In this scenario, the mediator tells a person, “Alice,” to buy the software. If Alice ignores the advice, then with some probability, the mediator later tells several people who normally interact with Alice not to buy the software. That greatly increases the chances of Alice’s computer becoming infected with a virus. The mediator has now introduced the fear of infection into the game.
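The sketch below illustrates that reactive idea in Python under simple assumptions: advice is issued in rounds, and a player who ignored last round's advice is punished with some fixed probability by having her neighbors advised to skip protection. The punishment probability and the round structure are assumptions made for illustration, not the published algorithm.

```python
import random

# Assumed probability that a defector's neighbors are advised to skip.
P_PUNISH = 0.3

def reactive_mediator(advice_log, followed, neighbors):
    """Return next-round advice, punishing players who ignored 'inoculate'."""
    next_advice = {}
    for player, prior_advice in advice_log.items():
        if prior_advice == "inoculate" and not followed[player]:
            # With some probability, advise this player's neighbors to skip
            # inoculation, raising the defector's own risk of infection.
            if random.random() < P_PUNISH:
                for nb in neighbors[player]:
                    next_advice[nb] = "skip"
    # Every player not being used as a punishment is advised to inoculate.
    for player in advice_log:
        next_advice.setdefault(player, "inoculate")
    return next_advice

# Example: Alice ignored last round's advice; Bob and Carol followed it.
advice_log = {"Alice": "inoculate", "Bob": "inoculate", "Carol": "inoculate"}
followed = {"Alice": False, "Bob": True, "Carol": True}
neighbors = {"Alice": ["Bob", "Carol"], "Bob": ["Alice"], "Carol": ["Alice"]}
print(reactive_mediator(advice_log, followed, neighbors))
```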
Unfortunately, a reactive algorithm may not be practical in a game with millions of anonymous players. Thus, Saia has also studied proactive algorithms. In this scenario, fear of reprisal is “baked into” the algorithm the mediator uses to advise the players.
Specifically, when Alice is advised to inoculate, there is some small probability that, at the same time, her neighbors are all told not to inoculate. Thus, Alice has a stronger incentive to follow the mediator’s advice. If this small probability is chosen carefully for Alice, and for all other players in the network, it is possible to find the right balance of fear and greed that will convince people to act in a way that contributes to the safety of everyone.
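Here is a minimal sketch of that proactive idea, again in Python: every player is normally advised to inoculate, but with a small per-player probability the player's neighbors are simultaneously advised to skip, so ignoring one's own advice always carries built-in risk. The single constant EPS used here is an illustrative assumption; the research tunes this probability carefully for each player in the network.

```python
import random

# Assumed per-player probability of the built-in "fear" event.
EPS = 0.05

def proactive_advice(players, neighbors, rng=random):
    """One round of proactive advice with punishment baked in up front."""
    advice = {p: "inoculate" for p in players}
    for p in players:
        if rng.random() < EPS:
            # The fear event for p: all of p's neighbors are told to skip,
            # so p has a strong incentive to follow her own advice.
            for nb in neighbors[p]:
                advice[nb] = "skip"
    return advice

players = ["Alice", "Bob", "Carol", "Dave"]
neighbors = {"Alice": ["Bob", "Carol"], "Bob": ["Alice", "Dave"],
             "Carol": ["Alice"], "Dave": ["Bob"]}
print(proactive_advice(players, neighbors))
```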
Saia and his research group will spend the next three years finding ways to design mediators to help ameliorate the tragedy of the commons effect. He believes his approach will be useful for many types of games including finding ways to ensure sharing of resources in systems like BitTorrent; allowing multiple users of wireless frequencies to share limited bandwidth; and allowing people to cooperate to counter denial of service attacks.
Saia is also interested in the theoretical limitations of the mediator approach. He has shown that there is a certain class of games for which a mediator cannot reduce a tragedy of the commons effect, and is currently studying what properties of a game allow for effective mediation.
Saia’s research builds on his previous work in which he sought to understand the strength of group collaborations. He discovered algorithms that allow web-based projects, in which many people work together on the same problem, to continue to function even if up to a third of the people working on the project are actively trying to destroy it. These same algorithms can now be used as the building blocks for implementing a mediator in a completely distributed fashion.
Saia works with collaborators around the world to build and test the mathematical algorithms that can be translated into useful ways to strengthen the internet and make it safer for everyone.
Media contact: Karen Wentworth (505) 277-5627; email: kwent2@unm.edu