After 20 Years, the Simulation Runs are Complete


The final test of my thesis simulator code was a simple Gaussian pulse traveling down the empty channel of a Si MOSFET. Simulating the first picosecond took close to a month on the school's Gould NP1 supercomputer. Today I finally tried running the simulation on my MacBook Pro to see if I could reproduce the results, and to see how long it would take.

I had typed in the input deck this past weekend but hadn't had much time to run it and look at the results. Today I just fired it off in the background and checked on it every so often to see if it had completed. It turned out I hadn't typed everything in correctly, and I needed to increase the maximum number of allowed Newton iterations to reach the convergence criterion I had in the code. No biggie, but it would have been especially nice to have gotten it right the first time.
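
For anyone curious what that knob does: the simulator re-solves the nonlinear device equations at every time step with Newton's method, and the input deck caps how many iterations it will attempt before declaring failure. Here's a minimal sketch of that pattern on a toy equation, not the thesis code itself; the names newton_solve, max_newton_iters, and tol are just illustrative stand-ins for whatever the real deck calls them.

```c
#include <math.h>
#include <stdio.h>

/* Toy residual and Jacobian standing in for the assembled device equations. */
static double residual(double x) { return x * x - 2.0; }
static double jacobian(double x) { return 2.0 * x; }

/* Returns the iteration count on convergence, or -1 if the cap is hit --
 * the same failure mode that raising the maximum iteration count fixes. */
static int newton_solve(double *x, int max_newton_iters, double tol)
{
    for (int it = 1; it <= max_newton_iters; ++it) {
        double f = residual(*x);
        if (fabs(f) < tol)
            return it;                  /* met the convergence criterion */
        *x -= f / jacobian(*x);         /* Newton update                 */
    }
    return -1;                          /* ran out of allowed iterations */
}

int main(void)
{
    double x = 10.0;
    int iters = newton_solve(&x, 25, 1.0e-12);
    printf("x = %.12f after %d iterations\n", x, iters);
    return 0;
}
```

With too small a cap, the loop bails out before the residual gets under the tolerance, which is exactly what my mistyped deck was doing.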

When I finally got the upper limit set right, the complete 10 psec simulation finished in about 72 minutes. It's all single-threaded code, so there's room for optimization there if I spend the time to make the matrix solution multi-threaded, but the work to do that would be significant. Still... under two hours for the full 10 psec versus a month for just the first picosecond. Yeah, computers certainly have changed in the last 20 years, but it's times like these that really accentuate it.
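
If I ever do tackle that, the obvious first step would be threading the inner linear algebra rather than restructuring the whole solver. Here's a rough sketch of what I mean, using OpenMP on a sparse matrix-vector product; the compressed-sparse-row layout (row_ptr, col_idx, vals) is an assumption for the example, not how the original code actually stores its matrix.

```c
#include <stdio.h>

/* y = A * x for a sparse matrix A in compressed-sparse-row form.
 * Each row's dot product is independent, so the outer loop can be
 * split across threads with a single OpenMP pragma. */
static void spmv(long n, const long *row_ptr, const long *col_idx,
                 const double *vals, const double *x, double *y)
{
    #pragma omp parallel for schedule(static)
    for (long i = 0; i < n; ++i) {
        double sum = 0.0;
        for (long k = row_ptr[i]; k < row_ptr[i + 1]; ++k)
            sum += vals[k] * x[col_idx[k]];
        y[i] = sum;
    }
}

int main(void)
{
    /* Small 3x3 tridiagonal example just to exercise the routine. */
    long   row_ptr[] = {0, 2, 5, 7};
    long   col_idx[] = {0, 1,  0, 1, 2,  1, 2};
    double vals[]    = {2, -1, -1, 2, -1, -1, 2};
    double x[] = {1.0, 1.0, 1.0};
    double y[3];

    spmv(3, row_ptr, col_idx, vals, x, y);
    printf("y = [%g, %g, %g]\n", y[0], y[1], y[2]);
    return 0;
}
```

The catch, and the reason the work would be significant, is that the matrix-vector product is only part of each Newton step; the assembly and the rest of the solve would need the same treatment to see a real speedup.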

I'll probably try to do some of the simulations that I simply could not do at the time - like the GaAs channel - because those results hold the real questions of the thesis work: was it a 1D simulation effect, and what would the frequency be when taking the 2D v(E) vector into account? In the end, I know what the results should say... it's real and it's reasonable, since after I left, the next student actually got one of my designs working. But still... it's nice to finally be able to close the book on that chapter of my life.