C - Translation Limits on Enum Constants
I have a specific question about the translation limits of C (as defined in the ANSI/ISO 9899:x standards family) regarding enumeration constants.
I have a thousand individually identifiable data sources that I'd like to enumerate. I want to respect the minimal translation limits of the C standard, since the actual limits are implementation-defined and exceeding them is undefined behavior (see "Is it undefined behavior to exceed translation limits, and are there checker tools to find it?").
I know there are translation limits on the number of enumeration constants within the same enum (C90: 127), on the number of identifiers declared within the same block (C90: 127), and on the number of external identifiers within a translation unit (C90: 511).
I think enumeration constants do not have linkage (please correct me if I'm wrong), and surely one can place them outside of block scope... So what puts translation limit constraints on the following pattern (besides the limits of the integral types of the target platform and, of course, the number of constants within one single enum) -- and if so, why?
myenumeration.h
---------------

    enum e1 { val11 = 0,      val12, /* ... */ val_1n, end1 = val_1n };
    enum e2 { val21 = end1,   val22, /* ... */ val_2n, end2 = val_2n };
    /* ... */
    enum en { valn1 = endn_1, valn2, /* ... */ val_nn, endn = val_nn };

    #define num_enum endn
Note: switching to #define won't help, as there are also translation limits on the number of defined macro identifiers (C90: 1024). I would be forced to #undef them in some complicated way, maybe with a complex #include pattern.
There is no requirement that a compiler allow a programmer to define 511 different enum variables, each with 127 different value names, each 31 characters long. Even if the names were stored in an absolutely optimal format, a compiler would still need about 1.5 megabytes to store them--not possible on a compiler that runs on a machine with 64K of total RAM and two 360K floppy drives [the source file defining the names might be a lot less than 64K if the names are generated using macro expansions]. Note that while such a machine would have been on the small side in 1989, C was commonly used on machines even smaller, and the authors of the Standard did not want to forbid such implementations.
A compiler could allow any amount of storage for identifiers, and abort compilation if a program exceeds that limit. (On systems that don't limit memory usage by individual programs, the compiler should set the limit high enough that no realistic program would hit it, but low enough that an evilly-written source file won't be able to crash the entire system.) If a compiler is designed for systems with many megabytes or gigabytes of RAM, the limits suggested by the Standard shouldn't be a factor. There may still be a limit, but there's no way to know what it is unless one hits it.