Computational Complexity
Computational Complexity and other fun stuff in math and computer science from Lance Fortnow and Bill Gasarch
Wednesday, October 29, 2025
AI and the Power of Nonuniform Circuits
Sunday, October 26, 2025
Bill's Bad Advice
I sometimes give the following advice for research which I label Bill's Bad Advice. We will later see who it might be good advice for. Spoiler alert: the number of people for whom it is good advice is shrinking but might include Lance especially now (see his post about stepping down from admin, here).
When you come across a possible research topic or problem, or have some idea, and are wondering if you want to pursue it, here is my bad advice:
1) DON'T CARE if anyone else cares. If YOU care, that is enough to at least get started.
2) DON'T CARE if it has the potential for a published paper. FIRST do the work; then, if you feel like it, look for a good venue. You might not bother with a formal venue if posting to arXiv, writing an open problems column, or doing a (guest) blog post is a good enough endpoint. (I run the SIGACT News Open Problems Column- feel free to contact me if you want to submit one.)
3) DON'T CARE if it has practical implications.
4) DON'T CARE if you can get a grant for it. With the current state of the NSF this advice may soon become irrelevant.
5) DON'T CARE if someone else already did it (though at a later stage you should check on this). Even if you work on it and then find someone else did it, you will have LEARNED about the problem through your efforts. You might then want to write a survey for your own benefit to consolidate your knowledge.
Why should you NOT CARE about any of these things? Because they get in the way of actually DOING something.
Here are two examples of when this approach WORKED and one where it DID NOT WORK, though the classifications might depend on your definition of WORKED.
WORKED: My work on Muffins. All I wanted was to get some High School and Undergraduate Projects out of it. I ended up with a book on it which made me twenty dollars last year! More to the point, I learned a lot, as did my co-authors and I am happy with the book. The co-authors were Undergraduates so my dept put me up for a mentoring award (I have other credentials as well). I did not win, but I got a nice letter saying they had many qualified applicants. OH- it didn't say if I was one of them.
WORKED: I have had many Ramsey Projects where a High School Student codes stuff up and learns some Ramsey, some coding, and gets the experience of research. Sometimes they do a survey paper or open problems column. We both learn A LOT from this and the student gets a good letter from me. Do they do something NEW? Publishable? No, though some surveys and open problems columns have come out of this. I DO tell them ahead of time that the work is unlikely to lead to original results (and hence unlikely to be good for a science competition).
DID NOT WORK: See this blog post here about the math and here about finding out that the problem we were working on was already known and more was known than we thought. I didn't mind this, but one of the authors did.
Note that WORKED and DID NOT WORK also depend on your goals.
For whom is this bad advice? Good advice?
1) It was always bad advice for young assistant professors who need to get papers and grants to get tenure.
2) Hypothetically, once you get tenure you have job security and hence can change fields or follow my bad advice without consequences. But with grants and salary and teaching load issues, this is less the case. Perhaps I am nostalgic for a time that never was.
3) High School Students were my main audience for bad advice. It's not as important for them to get papers out as for (say) assistant professors. But even this is changing. Colleges are getting more competitive. And HS students may well want a project that can lead to Science competitions. I am not going to say things were better when I was a kid but instead pose non-rhetorical questions:
a) Are high school students getting into research earlier than they used to? I am sure the answer is yes.
b) Are we losing the safe space where a high school student can just learn things and do things without worrying so much about whether it's publishable? Yes, but I am not sure how widespread the loss is.
c) If we are losing that safe space, is that a bad thing?
d) Points a, b, and c apply more to undergraduates who want to go to graduate school than to high school students who want to go to college.
4) Full Professors may have more freedom to follow my bad advice. Lance is looking for things to do now that he is no longer a dean, and indeed, is back to being a teaching-and-research professor. So he might follow my advice. However, he actually cares if people care about his work. He does not have to follow all of my advice, but he can follow some of it.
Thursday, October 23, 2025
AI and Intro Theory
This fall, for the first time at Illinois Tech, I'm teaching Introduction to Theory of Computation. While I've taught a variation of this course a couple dozen times, I last taught this class in Spring 2016 at Georgia Tech. Intro Theory is a course whose main theorems don't change, but it feels different in the AI era.
Here are the big and little things for me.
NP as a Hard Problem?
Garey and Johnson's classic 1979 book on NP-completeness had this famous cartoon.
Monday, October 20, 2025
Sept 16, 2025 was Pythagorean Day
Several people emailed me that September 16, 2025---written as 9-16-25 in the US---represents the squares of the integer side lengths of a right triangle: \(9 + 16 = 25\), that is, \(3^2 + 4^2 = 5^2\).
9-16-25 is the only such square triple that is also a valid date. Since we write the year with two digits, this kind of mathematical alignment only happens once every 100 years. The next occurrence will be September 16, 2125.
Since this is such a rare event, let's explore some more math-themed dates.
1) Pythagorean Triples That Work as Future Dates
Note that 9-16-25 is not a Pythagorean triple; however, 3-4-5 is.
Here are some future dates that are both Pythagorean triples and valid calendar dates:
March 4, 2105 is 3-4-5
May 12, 2113 is 5-12-13
June 8, 2110 is 6-8-10
July 24, 2125 is 7-24-25 (Darn---July 24, 2025 was recent and I missed it!)
August 15, 2117 is 8-15-17
That covers the triples people usually list, though a brute-force check turns up a few more valid dates, such as 9-12-15 (September 12, 2115). Recall that we need the month to be in \(\{1,\ldots,12\}\) and the day to be in \(\{1,\ldots,31\}\) with some exceptions:
Thirty days has September, April, June, and November
All the rest have thirty-one,
Excepting February, fun!
And that has twenty-eight days clear
And twenty-nine in a Leap Year
There are 24 versions of this poem at a website, here.
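The validity check above is easy to automate. Here is a minimal sketch (the function name and day-count table are my own) that enumerates every M-D-YY date whose month, day, and two-digit year form a Pythagorean triple:

```python
# Enumerate all dates M-D-YY (US order, two-digit year) where
# month^2 + day^2 == year^2, i.e. the date is a Pythagorean triple.
# February gets 29 days here since no triple uses day 29,
# so leap years cannot change the answer.
DAYS = [31, 29, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]

def pythagorean_dates():
    dates = []
    for month in range(1, 13):
        for day in range(1, DAYS[month - 1] + 1):
            for year in range(1, 100):
                if month**2 + day**2 == year**2:
                    dates.append((month, day, year))
    return dates

for m, d, y in pythagorean_dates():
    print(f"{m}-{d}-{y:02d}")
```

Running it confirms the dates above and also surfaces triples whose numbers are not in increasing order, such as 4-3-5 (April 3, 2105, since \(4^2 + 3^2 = 5^2\)).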
2) Why Didn't Anyone Email Me About Earlier Dates?
I wonder why nobody emailed me on, say, March 4, 2005 (3-4-5). That's a Pythagorean triple, but maybe it just looked like three consecutive numbers. Oh well.
And what about May 12, 2013 (5-12-13)? That's a really cool Pythagorean triple. Oh well.
3) Other Math-Related Dates Using Month, Day, and Year.
So dates like Pi Day don't count---we want the full date to be interesting mathematically. Side note---I looked up how Pi Day is referred to, and it's Pi Day, not \(\pi\) Day, probably because not all typesetting systems can easily produce \(\pi\).
Every year divisible by 4 is a leap year, except century years, which are leap years only if divisible by 400. So 2000 was a leap year, but 2100 will not be.
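For the record, the full Gregorian rule fits in one line (a quick sketch; the function name is my own):

```python
def is_leap(year: int) -> bool:
    # Gregorian rule: divisible by 4, except centuries,
    # unless the century is divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap(2000), is_leap(2024), is_leap(2100))  # True True False
```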
That is the end of my post. A bonus for my readers: a mini meta post. Longtime reader and fellow SUNY Stony Brook math major David Marcus (he was class of 1979, I was class of 1980) has been proofreading some of my posts before I post them. Lance suggested I use chatty (ChatGPT) instead. Rather than substituting, I used chatty and then David. David still found mistakes. I give an example here by pointing to all three versions.
Wednesday, October 15, 2025
Fall Jobs Post 2025
In two years we have gone from nearly all of our computer science graduates getting jobs in the field to many of them struggling to get internships and jobs at top companies, if at all. If the past is any guide, a weak tech job market leads to fewer majors, which leads to fewer slots for computer science faculty. We'll start to see these trends this year, and they will accelerate quickly if the tech job market doesn't recover.
In fact, the tech job market for computer science graduates has become much more challenging, ironically just as the tech companies have seen huge market growth from advances in artificial intelligence. Academia moves slowly; some departments are still catching up on hiring, but that will soon slow. Many computer science departments are adding AI majors and specializations, and computer science enrollments over the next few years will be hard to predict.
Computer science is not protected from larger forces. Many universities are facing long-term fiscal challenges, and cuts in research and student funding from the US government aren't helping, leading to hiring freezes at many schools. The demographic cliff, a drop in the US birth rate around the 2008 fiscal crisis, will start hitting colleges in earnest next year.
The Trump administration has made it more difficult for foreigners to get student visas, leading to a projected drop in incoming international students of between 30% and 40%. Recent changes to H-1B visas, namely the $100K fee an employer would need to pay, may discourage more students who study in the US with hopes of building a career and a life here. International students, many of whom have gone into computer science, are a major source of tuition for many universities.
Universities would also need to pay the $100K to hire foreigners as faculty and postdocs. Startup packages for new computer science faculty typically run several hundred thousand dollars so this is not out of the question, but schools might be reluctant to pay the $100K in tight fiscal times. These rules may evolve via litigation or administrative reinterpretation.
Even in a turbulent market, visibility and adaptability matter most. Universities outside the US are trying to take advantage by going after students and faculty. More than ever you should consider positions around the world, especially, but not only, if you are not a US citizen or permanent resident. Other countries have their own challenges, so tread carefully.
On to specific advice for students on the academic job market.
CATCS maintains a Theoretical Computer Science Jobs posting site. More generally in computer science, there is the CRA Database of Job Candidates and the CRA and ACM job postings.
And in general, make sure you stand out. Areas related to data, such as Artificial Intelligence, Data Science, and Cybersecurity, will draw the most interest. It's best if you can tie your research to those areas, or at least show you are open to teaching in them. Make sure your teaching statement talks about how you will handle AI in the classroom.
Have a well-designed website with all your job materials and links to your papers. Make sure your Google Scholar, LinkedIn and GitHub (if relevant) sites are accurate and up to date. Make a short video describing your research to a general computer science crowd. Take advantage of all the links above. Network at FOCS or SODA if you can get there. Reach out directly to professors you know at schools you are interested in. And start now.
Sunday, October 12, 2025
The Most Common Name in the World is Not Charles Lin. But It Seems That Way To Me.
In 2001 I supervised Charles Lin's Master's Thesis, which was on Private Information Retrieval.
In 2025 I supervised Charles Lin's Master's Thesis, which was on Ramsey Theory.
These are different people. To celebrate the second one's thesis, I took them both out to lunch together.
A Picture of Charles, Charles, and Bill is here.
I asked ChatGPT about common names.
1) ChatGPT tells me (and I believe them) that the number of people in America named Charles Lin is between 200 and 900. The answer pointed out that Charles is common and Lin is common, but the combination is not.
2) ChatGPT tells me (and I believe them) that the number of people named John Smith is between 25,000 and 40,000. I've heard it's a common name, but I do not know anyone named that. How is that possible?
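A rough back-of-envelope calculation suggests knowing no John Smiths is unsurprising. The acquaintance count below is my own assumption, not from ChatGPT:

```python
# Expected number of John Smiths among your acquaintances,
# assuming (my numbers): 40,000 John Smiths in a US population
# of roughly 330 million, and that you know about 600 people.
john_smiths = 40_000
us_population = 330_000_000
acquaintances = 600

expected = acquaintances * john_smiths / us_population
print(f"{expected:.3f}")  # about 0.07, so most people know none
```

Even with the most common name in the country, the expected number of them among your acquaintances is well under one.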
3) I predict that people with last name Smith will be reluctant to name their child John because it is so common. Hence this name will decline. However, I made that same prediction 40 years ago. Maybe it did decline TO 40,000.
4) The most common boys' names in 2025 (so far) in the US are, in order:
Liam, Noah, Oliver, Theodore, James, Henry, Mateo, Elijah, Lucas, William
According to this website here, the five most common boys' names in 1960 (the year I was born), in order, are
David, Michael, John, James, Robert.
James is the only name on both lists.
I don't have the next five, though I suspect Henry and William would be on it. I also suspect that Liam and Mateo would not be in the top 100.
5) The most common girls' names in 2025 (so far) in the US are, in order:
Olivia, Emma, Amelia, Charlotte, Mia, Sophia, Isabella, Evelyn, Ava, Sofia
(Should they have Sophia and Sofia both on the list? I would say yes but it is odd.)
According to that same website, the most common girls' names in 1960 (the year I was born), in order, are
Mary, Susan, Linda, Karen, Donna.
None of the 1960 names are on the 2025 list. I suspect that would still be true for the next 10.
6) When I was in grade school if the teacher said Quiet down Bill and Lisa the class went quiet.
7) The most common first name in the world is Mohammad. The most common last name in the world is Wang. Hence the most common name in the world has to be Mohammad Wang. I've asked students to find the flaw in that logic and all 8 Mohammad Wangs in my class found the flaw. Kudos to them! (Darling says I should always tell you when I am kidding. I am kidding. Only 2 of the 8 Mohammad Wangs in my class found the flaw.)
8) (ADDED LATER) One of my readers emailed me the following two items
a) From The Big Bang Theory
Howard: It gets better. Someone has to go up with the telescope as a payload specialist, and guess who that someone is.
Sheldon: Mohammed Lee.
Howard: Who’s Mohammed Lee?
Sheldon: Mohammed is the most common first name in the world, Lee, the most common surname. As I didn’t know the answer, I thought that gave me a mathematical edge.
b) As an interesting aside, I noticed that this semester I have three students named "Luke" and three named "Ben", which made me realize that kids who grew up with the original Star Wars movies would have kids in college around now (but none of my current students are named "Leia").
9) (ADDED LATER) I HEARD that the name `Barack' was popular when he was president.
I looked it up:
5 babies got that name in 2007
52 babies got that name in 2008
69 babies got that name in 2009
It has declined since then.
BAD SCIENCE TIME: The number of babies named Barack increased 10-fold from 2007 to 2008.
TRUE, but with counts this small it is not meaningful.
10) I hope to supervise at least one more Charles Lin before I retire.
Wednesday, October 08, 2025
Big Bots Don't Cry
Sunday, October 05, 2025
If you use AI in your work do you brag about it or hide it?
You used AI in your work. Do you hide it or brag about it?
1) In 2002 there was a movie, Simone, about an actress who is really an AI. The Wikipedia entry tells the entire plot. I saved time by reading it in two minutes rather than watching the movie for two hours. I don't think I missed anything.
Key Plot Point: The creator of Simone hides the fact that Simone is an AI. Why hide it? Think of the publicity that would be generated by bragging about it!
The following comment would have been clever in 2002 but is so obvious now that it's banal:
Hiding that it's an AI actress is more unrealistic than having an AI actress.
Fast forward to 2025 and into reality: there IS an AI actress named Tilly Norwood (do AIs need last names? Apparently yes). She has not appeared in any movies yet, but she will be on Instagram and other media soon. Are her creators trying to hide that she is not human? Quite the contrary---they are bragging about it. The Screen Actors Guild is complaining about this, see here. The complaint is that she can't be a real actress since she has no real life experiences to draw upon. Whether she will be a good actress is for the studios, the audience, and the critics to decide. If a new technology threatens your livelihood, then arguing that it's not very good is a terrible argument: the claim may well turn out to be false, and quality is not really your concern anyway.
2) Recently Scott used AI GPT-5-thinking to help him with a proof. Did he hide this fact? Quite the contrary, he pointed it out as an interesting application of AI. See his blog post here.
3) There was a Chemistry Nobel prize, and a Physics Nobel prize, for work done where AI played a role. Did they try to hide that AI was used? Quite the contrary. See Lance's post on this, here.
4) Do you know of any case where AI or a computer was used and the authors wanted to hide that fact? Aside from chess players using a computer, and students using ChatGPT, I cannot think of one. Not for any serious research. (My spellcheck thinks ChatGPT is not a word. This time spellcheck is wrong.)
5) The opposite has happened: The Mechanical Turk chess-playing "machine". See here.
6) I recently saw the movie I, Robot from 2004. The main human character is against robots and says Can a robot write a symphony? He means this to be a rhetorical question whose answer is no. I counter: can a computer write a movie as dumb as I, Robot? Simone?
