Good. Lawyers, bankers, accountants, insurance agents, realtors, car salesmen, loan officers. All PARASITES who don't actually do any real work. They make rules and regulations so complicated that they can keep their made-up jobs.
78 percent of litigants in court are self-represented, so I think AI can be very useful for them to get help with minor legal tasks to make the judicial process much easier.
I work in a court system and am pretty excited about the possibilities. We’re looking to help people who don’t have access to lawyers, not replace them. This also addresses our staffing issues, as everyone wants a fully remote job or jumped ship for bigger salaries.
And these self-represented litigants find themselves in a process where the judges and lawyers have paraphrased the law in their procedural jargon, and can't be bothered to read actual statutes unless you go through the tortuous process of making them do it. It is ridiculous.
AI is no help there. The real problem is that the root of our court system is conflict-based. That is, you will never get a thorough, comprehensive review of the law unless you bring a strong case that makes a judge dust off the book and actually read it. They are too high and mighty to admit error to a layman.
These people are the biggest frauds in the world and, as they say, will be the first against the wall when the revolution comes.
To bring a strong case means you have $$$.
I’m a lawyer and can already see the writing on the wall.
I asked GPT to write an amicus brief for a case I'm currently following. It did. And it cited specific (and relevant) case law and basically wrote the entire thing without any incoherence or logical errors.
If you’re not a lawyer, let me put that into context. A decent amicus brief can take days to research and write. It’s substantive legal work.
The GPT brief wasn't perfect and required some editing, but the fact that it wrote 90% of it is absolutely mind-blowing.
Not sure what to make of all this. I'm confused and a bit scared. I can't see a scenario where many professionals (including lawyers) aren't made obsolete in the coming years.
I don't think it'll go that fast. Some clients of mine have used AI to ask legal questions and added its answers to the questions they sent me. Although the answers were pretty close to being right, some details were plainly wrong and would have made them lose the case if they had followed them. You still need some type of human control to check what is being said and done. As you know, the quality of legal advice lies in the details: using one specific word instead of another, or putting more emphasis on one thing instead of another, can really make a difference. Because of that, I don't see people trusting AI for this.
I don't agree. I've played around with ChatGPT and Sydney and I actually don't see much of a use for it. It is interesting and fun to do, but feels like a fad. Ultimately, ChatGPT says things that are wrong because it just pulls and amalgamates data from web searches.
So what you get with ChatGPT is a condensed Google/Bing search, but without being able to see that the response is mainly built off of a post from a random forum user. That, ultimately, makes ChatGPT something people don't trust as a source of information. The phrase "This looks like it was written by ChatGPT" is already being used to discredit arguments, articles, and statements people don't like.
People need to trust their lawyers. They're not going to trust "the internet" to handle their sensitive matters. ChatGPT might write a few wills for people, but those will generally be people who have limited means and wouldn't otherwise write a will. And the assistance ChatGPT will provide on that will be limited to formatting, as the user will still need to say "split my estate between my two children and my sister."
And ChatGPT's potential theft of intellectual property will generate jobs for lawyers for years. For those of us who remember how Napster was going to ruin the music industry, this feels familiar. Napster ultimately lost its lawsuit, and the result of that lawsuit became the de facto regulation of the music industry online. I suspect it will be the same with ChatGPT. In fact, the Napster lawsuit may well be the precedent cited in resolving the ChatGPT lawsuits.
15 years ago I was told (by an "expert") that teachers were going to be replaced by computers within 5 years and all they would need is an adult in the room to babysit the kids.
Here I am, happily retired subbing for teachers who come up with lesson plans and activities.
There is something about having a human in the room in some professions.
Good, there are already too many lawyers out there. Time to cull the herd. We can start with Congress. AI could replace most of these turds now as they don't do squat anyway.
A federal judge in New York City has ordered two lawyers and their law firm to show cause why they shouldn’t be sanctioned for submitting a brief with citations to fake cases, thanks to research by ChatGPT.
Or we could end up having a greater number of lawyers. We’ll still still need lawyers to interpret cases and statutes, but we’ll need even more lawyers to review AI interpretations of cases and statutes.
The citations are often fake. It can get general principles right, but when nuance is involved it is often confidently wrong.
Granted, the same could be said about many lawyers.
I've seen that firsthand! Their job is to win, not to be correct.
Some of the nonsense they come up with gets accepted by the courts and decides cases for years with nobody noticing. The system is a bureaucracy, nobody will fix things until some effort is made to make them do it.
AI will allow the readers of briefs (e.g., judges and opposing counsel) to check citations more efficiently than they do now.
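A minimal sketch of what such a check might look like, assuming a verified citation list is available. The regex, the `flag_unverified` helper, and the `verified` set below are all illustrative assumptions, not a real citation database or tool:

```python
import re

# Extract reporter-style citations (e.g. "123 F.3d 456") and flag any that
# do not appear in a verified list. Pattern covers only a few reporters,
# purely for illustration.
CITE_RE = re.compile(r"\b(\d{1,4})\s+(F\.(?:2d|3d|4th)|U\.S\.|S\.\s?Ct\.)\s+(\d{1,4})\b")

def flag_unverified(brief_text, verified_cites):
    """Return citations found in the brief that are not in the verified set."""
    found = {" ".join(m.groups()) for m in CITE_RE.finditer(brief_text)}
    return sorted(found - verified_cites)

brief = "Plaintiff relies on 523 U.S. 83 and 999 F.3d 1234."
verified = {"523 U.S. 83"}
print(flag_unverified(brief, verified))  # -> ['999 F.3d 1234']
```

A real checker would query a citator rather than a hand-built set, but the workflow is the same: machine-extract every cite, then verify each one against an authoritative source before the brief is trusted.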
Good. Lawyers, bankers, accountants, insurance agents, realtors, car salesmen, loan officers. All PARASITES who don't actually do any real work. They make rules and regulations so complicated that they can keep their made-up jobs.
I wouldn't say all of those are the problem jobs, but I generally agree with the sentiment. Our state-run Medicaid system decided to require a new certification that does nothing to help the people we serve and only siphons taxpayer money into bureaucratic nonsense. From what I've seen, this is how employers across the board operate. Maybe lawyers could be useful if they actually served the common citizen instead of protecting corporate assets.
The task I use ChatGPT most for is doing the boring parts of coding. It’s not perfect but if you know enough to double-check its work, it’s a huge time-saver.
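As an illustration of that "know enough to double-check its work" workflow: treat generated code as an untrusted draft and pin its behavior down with tests before relying on it. The `slug` function below is a hypothetical example of assistant-drafted code, not from any real session:

```python
# Draft, as an assistant might produce it: URL-slugify a title.
def slug(title):
    return "-".join(title.lower().split())

# The human's job: probe edge cases the draft may have missed.
assert slug("Hello World") == "hello-world"
assert slug("  leading space") == "leading-space"  # split() strips extra whitespace
assert slug("") == ""                              # empty input
print("draft passes the checks")
```

If a draft fails a check like these, you've caught the bug before it shipped; that's the double-checking the poster is describing.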
How many iterations will it take until the average person out of a coding boot camp is worse than the machine? Three? Four? In that world, how will software engineers get a foot in the door?
Thomson Reuters (Westlaw) has announced a joint venture with Microsoft 365 Copilot to use AI in drafting and research. I actually think this could be very good for the legal profession. If AI can reduce the time needed to do research, write briefs, and prepare transactional documents, people would be able to have their disputes resolved or transactions negotiated for a much lower attorney's fee bill.

Lowering the cost of hiring a lawyer opens up representation to a much larger segment of the population who previously could not afford it and would either go pro se or just give up on a claim. Every day in my practice I see people with legitimate claims or defenses give up because the attorney's fees involved in bringing the dispute to court are too high. If I can do the work more cheaply by using AI, I would probably net out the same, if not see more work, because cases that would normally bow out over cost could now proceed. That makes a big difference for the little guy, who often gets steamrolled by big business because of the sheer resources involved in litigation or negotiating a contract.
I will say that what I have seen of AI in the legal "space," like Law ChatGPT, is not very good. All lawyers have forms for pleadings, correspondence, contracts, etc. that they have built over the years. Law ChatGPT's work product is actually pretty bad for transactional work: it seems to want to pile on boilerplate and has no ability to understand how provisions work together. I have seen a Law ChatGPT contract whose evergreen term renewal provisions and cancellation/termination notice provisions directly conflicted; the latter basically made the former useless, and Law ChatGPT did not understand that. Its research and briefing are only helpful for basic stuff and cannot get into much depth, as Law ChatGPT is only as good as the data it is mining and mimicking. If you have an issue of first impression, a split in authorities, or an issue that comes at the law from a different angle than the courts have addressed, Law ChatGPT might find a good case faster than searching Westlaw/Lexis, but it cannot make an argument for you that will persuade a judge to stick her neck out and risk reversal on appeal to grant your motion for summary judgment.
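The renewal/termination conflict described above can be made concrete with a toy date calculation. All term lengths and dates below are hypothetical, chosen only to show the interaction the drafting tool missed:

```python
from datetime import date, timedelta

# An evergreen clause auto-renews unless notice is given N days before the
# term ends, but a separate clause lets either party terminate at any time
# on only M days' notice. If M < N, the termination clause effectively
# swallows the renewal lock-in.

term_end = date(2024, 12, 31)
renewal_notice_days = 90      # must opt out of auto-renewal by this deadline
termination_notice_days = 30  # but can terminate any time on 30 days' notice

renewal_optout_deadline = term_end - timedelta(days=renewal_notice_days)
print("last day to block auto-renewal:", renewal_optout_deadline)

# Even after missing that deadline, a party can exit the "renewed" term:
exit_date = term_end + timedelta(days=termination_notice_days)
print("earliest exit despite renewal:", exit_date)
print("termination clause defeats lock-in:",
      termination_notice_days < renewal_notice_days)
```

Spotting that the two clauses interact at all is exactly the kind of cross-provision reasoning the poster says the tool lacked.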
I have a feeling that if instead of training on the whole internet, you only trained on your states case laws most of the issues would go away.
That's not how LLMs like ChatGPT work. You would get an even stupider ChatGPT without all the stolen copyrighted text it was trained on, because there wouldn't be enough material. And even if you did only use state law as the training material, LLMs like ChatGPT inherently fabricate things because they don't actually have any concept of understanding. They are just advanced autocomplete with random number generators thrown in for variety.
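The "advanced autocomplete with random number generators" point can be illustrated with a toy bigram sampler. The table below is made up, and real models use neural networks over huge vocabularies, but the final sampling step really is a weighted random draw like this one:

```python
import random

# Toy next-token table: counts of which word followed which in a tiny corpus.
bigrams = {
    "the": {"court": 5, "case": 3, "law": 2},
    "court": {"held": 6, "ruled": 4},
}

def next_token(prev, temperature=1.0, rng=random):
    """Sample the next word after `prev`, weighted by count."""
    counts = bigrams[prev]
    # Temperature reshapes the distribution: low -> nearly greedy, high -> flatter.
    weights = [c ** (1.0 / temperature) for c in counts.values()]
    return rng.choices(list(counts), weights=weights, k=1)[0]

random.seed(0)
print([next_token("the") for _ in range(5)])
```

Nothing in this sampler "understands" courts or cases; it only reproduces statistical patterns in its training data, which is why a narrower corpus alone cannot stop fabrication.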
Yes but the less garbage you put in, the less you get out.
And if you want to get philosophical, do people really understand stuff or do they do a bunch of pattern matching?