At the end of August, the GAO finally issued its mandated report to Congress on the factors that affect patent infringement litigation. Why? Patent litigation has increased dramatically over the past decade. From 2000 to 2011, over 29,000 patent suits were filed in U.S. district courts, and the overall number of defendants in these cases increased by 129 percent from 2007 to 2011 alone.
Before the study’s release, patent troll suits were thought to be the primary cause of the spike—and Congress was pushing forward with a variety of bills seeking to curb them. However, the GAO’s findings reached a different conclusion [PDF]: software patents, not patent troll suits, were most responsible for the rise in litigation rates. A more efficient means of counteracting rising patent litigation, the report suggested, would be for the PTO simply to improve the quality and examination of software patents.
Behind the GAO’s findings is Palo Alto-based Lex Machina, an innovative start-up that provides IP litigation data and predictive analytics to its clients.
Lex Machina, meaning “Law Machine” in Latin, takes a quantitative approach to IP litigation. Its goal is to bring “openness and transparency” to IP law and better inform future policy by mining data and documents from PACER, all 94 district court sites, the International Trade Commission’s electronic database, and the Patent and Trademark Office. After accessing all of these records (over 165,000 lawsuits and counting) and converting them into searchable files, Lex Machina uses natural language processing and machine learning tools to classify cases and aggregate data.
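Lex Machina’s actual pipeline is proprietary, but the classification step described above can be illustrated with a toy sketch. The snippet below (purely hypothetical—the keyword lists, labels, and function names are our own invention, not Lex Machina’s) tags a docket entry as patent- or trademark-related by counting category-specific keywords, a crude stand-in for the natural-language classification the company performs at scale:

```python
import re
from collections import Counter

# Hypothetical illustration only -- not Lex Machina's method.
# Each case category gets a small set of telltale keywords.
KEYWORDS = {
    "patent": {"patent", "infringement", "invention", "uspto"},
    "trademark": {"trademark", "brand", "confusion", "dilution"},
}

def classify(text: str) -> str:
    """Label a docket entry with the category whose keywords appear most often."""
    tokens = Counter(re.findall(r"[a-z]+", text.lower()))
    scores = {
        label: sum(tokens[word] for word in words)
        for label, words in KEYWORDS.items()
    }
    # Highest keyword count wins; ties fall to the first-listed label.
    return max(scores, key=scores.get)

entry = "Complaint alleges infringement of a U.S. patent held by plaintiff."
print(classify(entry))  # patent
```

A production system would of course replace keyword counts with a trained statistical model, but the basic shape—turn raw court documents into features, then map features to case categories—is the same.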
CEO Josh Becker describes Lex Machina’s predictive capabilities as bringing “Moneyball” statistics to politics, business strategy, and the law. Traditional legal research, from policy making to forum shopping, has relied primarily on case law and professional experience. Lex Machina’s data, on the other hand, can immediately and efficiently provide a firm with statistics on a particular judge or adversary’s tendencies; a regulatory body with analytics evaluating a proposed regulation’s cost; or a hiring entity with an individual attorney’s performance record.
Yes, that’s right, they know your batting average.
How is this going to change legal decision-making (besides terrifying IP lawyers right out of the profession)? Litigation costs—quality versus trolling aside—are more prohibitive than ever and businesses are looking for relief. Many start-ups are finding themselves buried in $1-2 million lawsuits. Some larger companies spent $21 billion in litigation fees in 2011 alone. Lex Machina can deliver targeted services to help them select project managers and outside counsel, determine how hard to push back against a plaintiff, and evaluate how their litigation is proceeding in relation to similar past cases.
IP firms likewise find themselves in an increasingly competitive environment in an extremely fast-paced and changing industry. Once again, Lex Machina aims to create a competitive advantage. It can quickly provide information to a firm before it pitches to a company and then can supply data support to enable the firm to make effective decisions throughout litigation.
While Lex Machina and other members of the burgeoning legal tech industry may transform the delivery of legal services, legal acumen will not be forsaken. At the end of the day, Lex Machina only provides hard data and predictions—interpretation remains a human task. Indeed, there were wide discrepancies in interpretation of the GAO report’s data. For instance, the GAO read Lex Machina’s data to conclude that patent trolls brought only about 20 percent of patent infringement suits. Meanwhile, Lex Machina’s Director of Legal Analytics and its founding CEO co-authored an academic paper relying on the same data, which found that patent trolls brought closer to 40 percent of suits. In case anyone is interested, footnote 35 of the report attributes this whopping 20-point gap to “differences in methodology.”
Determining how such differences should be weighted and implemented as a matter of policy is a task left to members of Congress for another day. We’ll just have to wait and see how that works out.
–Christine M. Carletta
[We maintain that this image is (1) not copyrightable for lack of Feistian originality (text); and (2) that ours is a fair use based on three of the four 17 USC § 107 factors: nonprofit educational purpose; the data-centric nature of the work; and the lack of (negative) effect on the market for the work. --Ed.]