# Input size of the dmrg and hubbard samples

In sample/dmrg.cc, the input size "N" is 100 by default; in sample/hubbards.cc, Nx = 8 and Ny = 4 by default. Does it make sense to increase the size "N" from 100 to 100000, or Nx and Ny from 8 and 4 to 128 and 64 in those applications, respectively?

## 1 Answer


Answer from Miles:

Yes you certainly can change N because those are just examples. As far as the question of whether it makes sense, I could interpret that question in a few different ways:

Increasing N, or Nx and Ny, will definitely make the program use more time and memory. For a 1D calculation, increasing N has a linear cost, so it's not too bad, though if N is 100,000 the cost will be 1000 times what it is now, which could be many hours or even days. For a 2D calculation, increasing Nx has a similarly linear cost, but increasing Ny imposes an exponential cost, so you won't be able to go much beyond about Ny = 10 without running out of memory or computer time.

In terms of whether it makes sense physically to increase N that much, usually it does not. Often you can determine the bulk behavior of most lattice models on systems of linear dimension about 100 at most, and often much smaller. The reason is that for gapped systems the correlation length is typically about 10 lattice sites or less, except very close to critical points. At or near a critical point or in a gapless phase, it can make sense to study larger systems because of the slow, power-law decay of correlations and strong boundary effects, but even here one can often extract a lot of information from smaller systems by doing careful scaling analyses and extrapolations.

That's a very general answer; a more specific one depends on the system you are studying and what physical property you are measuring. Always see what you can get out of smaller systems before going to larger ones, of course.