From: sachin on
Recently my MFC application has been giving a strange error:
System error code 1816,
"Not enough quota is available to process this command."

One feature of the program hashes (MD5 and SHA1) the incoming
file, copies the file, then hashes (MD5 and SHA1) the copy and
checks whether both hashes match.

When I turn this feature off, the error doesn't reproduce. I am
hashing the content.

Any idea why such an issue comes up?
The memory footprint of the program was 27 MB at the time of the error.
From: Michael Mol on
On Feb 4, 12:39 am, sachin <sac...(a)discussions.microsoft.com> wrote:
> Recently my MFC application has been giving a strange error:
> System error code 1816,
> "Not enough quota is available to process this command."
>
> One feature of the program hashes (MD5 and SHA1) the incoming
> file, copies the file, then hashes (MD5 and SHA1) the copy and
> checks whether both hashes match.
>
> When I turn this feature off, the error doesn't reproduce. I am
> hashing the content.
>
> Any idea why such an issue comes up?
> The memory footprint of the program was 27 MB at the time of the error.

Very likely, you're exceeding the user's disk quota with your
file copy. It sounds like this is a data archival/backup feature.
Essentially, your target volume has run out of the disk space the
computer (or network server) is willing to allot to your user.
Suggested solutions: compress your data before storing it on the
archival target (you can use streaming decompression to check the
integrity of the data, or pass a copy of the data through your hash
function as you feed the compression algorithm for the archived
file), periodically purge old archived data, reduce the amount of
data you're archiving, or get a bigger disk quota. Or all four.
From: Nathan Mates on
In article <94E43210-7F95-4ECA-8826-E7844F02C7A3(a)microsoft.com>,
=?Utf-8?B?c2FjaGlu?= <sachin(a)discussions.microsoft.com> wrote:
>Recently my MFC application has been giving a strange error:
>System error code 1816,
>"Not enough quota is available to process this command."

Usual cause: out of memory, or out of memory to service the
requested allocation. I'd suspect you're getting this message when
trying to allocate memory to read in a large file all at once. Don't
do that. Read in small (1-4 MB) chunks, and process the chunks in
order.

On a standard 32-bit Windows environment, you can get about 75-80%
of the 2GB address space. But, the largest contiguous memory chunk you
can allocate is probably only in the 800MB range. If you really need
more, and streaming in chunks isn't doable, then go to a 64-bit OS and
application.

Nathan Mates
--
<*> Nathan Mates - personal webpage http://www.visi.com/~nathan/
# Programmer at Pandemic Studios -- http://www.pandemicstudios.com/
# NOT speaking for Pandemic Studios. "Care not what the neighbors
# think. What are the facts, and to how many decimal places?" -R.A. Heinlein
From: jjoohhnn on
Nathan,

> Read in small (1-4 MB) chunks, and process the chunks in order.

I would like to know more about this chunk size when processing huge
amounts of data from files. To get the best CPU and memory performance in a
Win32 application, which chunk size should I use?

Also, how can we identify each chunk with a unique value, while
keeping performance very high?

What is the difference between the FAT and NTFS file systems? Are
there any other file systems?

If this is off topic, please point me to the correct forum.

Thanks in advance.

Regards,
John.


From: Nathan Mates on
In article <OGhWmY0hJHA.1172(a)TK2MSFTNGP04.phx.gbl>,
jjoohhnn <jjoohhnn(a)microsoft.discussions.com> wrote:
>> Read in small (1-4MB chunks), and process the chunks in order.

> I would like to know more about this chunk size when processing huge
>amounts of data from files. To get the best CPU and memory performance in a
>Win32 application, which chunk size should I use?

The ideal/optimum chunk size (which you seem to be asking for)
probably depends on your HD, what other applications are running at
the same time, memory, etc. Write your code with a tuning value that
adjusts the chunk size. Then, experiment.

> Also, how can we identify each chunk with a unique value, while
>keeping performance very high?

I don't know what you mean by "unique value" here. You'll have to
be clearer. Much clearer. If you want performance, the best thing to
do is probably some form of overlapped I/O. Look at CreateFile --
http://msdn.microsoft.com/en-us/library/aa363858(VS.85).aspx ,
especially the FILE_FLAG_NO_BUFFERING & FILE_FLAG_SEQUENTIAL_SCAN
flags. (And, pay attention to all the comments as to the special work
you must do when using the first!) Then call ReadFile --
http://msdn.microsoft.com/en-us/library/aa365467(VS.85).aspx with an
overlapped handle. Going double-buffered where you request reads into
a second buffer when you're processing the first might help as well.

> What is the difference between the FAT and NTFS file systems? Are
>there any other file systems?

Get it running, first. Then make it fast. You've not managed to get
your code running for huge files. Caring about FAT vs NTFS is
irrelevant at this point. You're committing the error of premature
optimization, or premature "what if" nonsense. Working code that's not
stupidly slow is far better than thoughts on paper that care about
every detail under the sun.

Nathan Mates
--
<*> Nathan Mates - personal webpage http://www.visi.com/~nathan/
# Programmer at Pandemic Studios -- http://www.pandemicstudios.com/
# NOT speaking for Pandemic Studios. "Care not what the neighbors
# think. What are the facts, and to how many decimal places?" -R.A. Heinlein