Memory issues
I loaded a mat file that was converted using doc2mat. The file has 9 rows and 17,000 columns. When I try to cluster it, gCLUTO reports memory-corruption errors and crashes immediately. Can gCLUTO handle files this large? What is the maximum file size it supports?
Thanks in advance.
Submitted by vigneshwari on Mon, 2011-07-04 05:53
RE: Memory issues
Unfortunately, gCLUTO is no longer supported. Have you tried using the command-line programs in CLUTO instead? They handle much larger matrices than the graphical front end.
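For reference, a minimal sketch of running CLUTO's `vcluster` program directly on a sparse mat file. The file name `mydata.mat` and the choice of 10 clusters are placeholders; check the CLUTO manual for the options your version supports.

```
# Cluster mydata.mat (produced by doc2mat) into 10 clusters.
# vcluster is the CLUTO command-line program for clustering a matrix;
# it writes the cluster assignments to mydata.mat.clustering.10.
vcluster mydata.mat 10
```

Because `vcluster` runs without a GUI, it avoids gCLUTO's overhead and is the usual route for large matrices.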