Browsing by Author "Ma, David"
Now showing 1 - 3 of 3
Item Open Access
Does Domain Highlighting Help People Identify Phishing Sites? (2010-10-05)
Lin, Eric; Greenberg, Saul; Trotter, Eileah; Ma, David; Aycock, John
Phishers are fraudsters who mimic legitimate websites to steal users' credential information and exploit that information for identity theft and other criminal activities. Various anti-phishing techniques attempt to mitigate such attacks. Domain highlighting is one such approach, recently incorporated by several popular web browsers. The idea is simple: the domain name of an address is highlighted in the address bar so that users can inspect it to determine a website's legitimacy. Our research asks a basic question: how well does domain highlighting work? To answer this, we showed 22 participants 16 web pages typical of those targeted for phishing attacks, and participants had to determine each page's legitimacy. In the first round, they judged the page's legitimacy by whatever means they chose. In the second round, they were directed specifically to look at the address bar. We found that participants fell into three types in terms of how they determined the legitimacy of a web page; while domain highlighting was somewhat effective for one user type, it was much less effective for the others. We conclude that domain highlighting, while providing some benefit, cannot be relied upon as the sole method to prevent phishing attacks.

Item Open Access
Evaluating Usage Expertise Mined from Version Archives (2012-10-04)
Ma, David; Sillito, Jonathan; Zimmermann, Thomas
One approach to modelling coding expertise is to quantify the knowledge accrued from the use of library functionality, a concept known as usage expertise (Schuler and Zimmermann 2008). This thesis makes three contributions. The first is a formal specification of a system that mines usage expertise from a version control repository in order to recommend developers for a change task. The second is a comparison of the system's accuracy against the oft-used Line 10 model of developer expertise; this evaluation finds that the usage model yields simultaneous gains in the accuracy and the diversity of recommendations. The third and final contribution is a qualitative study that explores the trust and behavioural tendencies of nine software developers who were given the model reified as a software tool. The study finds usage expertise to be a trustworthy identifier of expertise; however, it also finds a series of social and organizational factors that limit the efficacy of the model in real-world contexts.

Item Open Access
Expert Recommendation with Usage Expertise (2009-07-09)
Ma, David; Schuler, David; Zimmermann, Thomas; Sillito, Jonathan
Global and distributed software development increases the need to find and connect developers with relevant expertise. Existing recommendation systems typically model expertise based on file changes (implementation expertise). While these approaches have shown success, they require a substantial recorded history of development for a project. Previously, we proposed the concept of usage expertise, i.e., expertise manifested through the act of calling (using) a method. In this paper, we assess the viability of this concept by evaluating expert recommendations for the ASPECTJ and ECLIPSE projects. We find that usage and implementation expertise have comparable levels of accuracy, which suggests that usage expertise may be used as a substitute measure. We also find a notable overlap of method calls across both projects, which suggests that usage expertise can be leveraged to recommend experts from different projects and thus for projects with little or no history.
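The first item above studies domain highlighting, in which the browser visually emphasizes the registrable domain in the address bar. The sketch below is only an illustration of that cue, not the study's implementation: the two-label heuristic and the function name are assumptions, and real browsers consult the Public Suffix List rather than simply taking the last two host labels.

```python
# Hypothetical illustration of domain highlighting: split a URL's host so the
# last two labels (a rough stand-in for the registrable domain) can be shown
# emphasized while everything else is de-emphasized, as a browser address bar
# does. Real browsers use the Public Suffix List; this two-label heuristic
# misbehaves on suffixes such as "co.uk".
from urllib.parse import urlparse

def split_for_highlighting(url):
    """Return (dimmed_prefix, highlighted_domain) for the URL's host."""
    host = urlparse(url).hostname or ""
    labels = host.split(".")
    if len(labels) <= 2:
        return "", host
    return ".".join(labels[:-2]) + ".", ".".join(labels[-2:])

if __name__ == "__main__":
    # A deceptive host: the familiar brand sits in attacker-controlled subdomains.
    prefix, domain = split_for_highlighting("http://www.paypal.com.evil-login.example/signin")
    print(f"{prefix}[{domain}]")  # www.paypal.com.[evil-login.example]
```

The example shows why the cue matters: the highlighted portion is the attacker's domain, not the brand embedded in the subdomain labels.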
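The second and third items share one underlying idea: a developer accrues usage expertise for a method each time one of their changes adds a call to it, and those counts can drive expert recommendation. The following is a minimal sketch of that idea only; the (author, added_lines) commit format, the regex-based call extraction, and the ranking are assumptions, not the papers' actual mining pipeline or scoring.

```python
# Hypothetical sketch of usage expertise: credit a developer for a method each
# time one of their commits adds a call to it, then recommend the developers
# with the highest call counts for that method.
import re
from collections import defaultdict

CALL_PATTERN = re.compile(r"\b([A-Za-z_]\w*)\s*\(")  # naive "identifier(" matcher

def usage_expertise(commits):
    """Map each called method to a per-author count of added calls."""
    counts = defaultdict(lambda: defaultdict(int))
    for author, added_lines in commits:
        for line in added_lines:
            for method in CALL_PATTERN.findall(line):
                counts[method][author] += 1
    return counts

def recommend(counts, method, k=3):
    """Return up to k authors who have added the most calls to `method`."""
    ranked = sorted(counts.get(method, {}).items(), key=lambda kv: kv[1], reverse=True)
    return [author for author, _ in ranked[:k]]

if __name__ == "__main__":
    history = [
        ("alice", ["cursor = db.openCursor();", "rows = cursor.fetchAll();"]),
        ("bob",   ["db.openCursor();", "db.openCursor();"]),
    ]
    table = usage_expertise(history)
    print(recommend(table, "openCursor"))  # ['bob', 'alice'] in this toy history
```

Because the counts are keyed by method name rather than by file, the same table could in principle rank developers from a different project that calls the same library methods, which is the cross-project use case the third item describes.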