At this point, we could have selected data types and field names, had we not created the specification in advance. Making our alterations where we did allowed us to save the specification, whereas this dialog is better suited to making such changes on a "one-off" basis. Indeed, the Advanced button here takes us back to the Import Specification dialog where we made our settings earlier.
The
fourth dialog of the Import Text Wizard might also serve as a means to "make
exceptions" in future imports by allowing us to make data type changes,
leave fields out altogether, and so forth, "on the fly" for the
duration of that single import action.
In any
event, we will not make alterations here, but can do a quick review of our
settings before pressing onward.
24.
Click Next.
25.
Select the No
Primary Key radio button on the fifth dialog of the Import Text Wizard
that next appears, as depicted in Illustration 12.
26.
Click Next.
We arrive at the sixth, and final, dialog of the Import Text Wizard. Here we are given an opportunity to name the new table we are about to create, all within the import process.
27.
Type the
following into the Import to Table box:
Event Log
The
final dialog of the Import Text Wizard appears as shown in Illustration
13.
Illustration 13: Final Dialog of the Import Text Wizard
While we are importing only the Application log in our practice example, we might consider combining all imported logs into a single table for more integrated event analysis. This would involve modifying the table to allow for slight differences in log layouts, as well as adding a field to identify the source log (Application, Security, System, etc.).
On the
other hand, we might have a reason for creating separate tables based upon the
source log involved. The needs analysis that we would perform prior to design
and creation of the database would dictate the considerations specific to our
environment.
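Should we opt for the single-table approach, a small pre-processing step could take care of the source-log field before the data ever reaches Access. The Python sketch below merges the individual dump files into one file and prepends a source-log column; the file names, and the assumption that the Elogdmp output is comma-delimited with quoted fields, are mine for illustration rather than anything produced in our practice example.

import csv
from pathlib import Path

# Hypothetical dump file names -- one per exported log. The comma-delimited,
# quoted layout is assumed to match the Elogdmp output from Part I.
DUMP_FILES = {
    "Application": "022004_app.txt",
    "Security": "022004_sec.txt",
    "System": "022004_sys.txt",
}

with open("022004_all_logs.txt", "w", newline="") as out:
    writer = csv.writer(out, quoting=csv.QUOTE_ALL)
    for source_log, dump_file in DUMP_FILES.items():
        if not Path(dump_file).exists():
            continue  # skip any log that was not exported
        with open(dump_file, newline="") as src:
            for row in csv.reader(src):
                # Prepend the source-log identifier so a single Access table
                # can distinguish Application, Security, and System rows.
                writer.writerow([source_log] + row)

The combined file could then be imported once, using a specification that includes the extra leading field.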
28.
Click Finish.
The log file is rapidly imported, and we next see a message box warning us of errors. The message box also tells us that a second, newly created table has been added, where we can find further information about the import errors. The message box appears as shown in Illustration 14.
Illustration 14: Message Box Warning of Import Errors
29.
Click OK
to close the message box and arrive at the database.
We see that, in addition to our new Event Log table, an ImportErrors table has been created (mine is called 022004_app_ImportErrors, as shown in Illustration 15). The table name is generated by MS Access, and is a composite of the original file name and the term "ImportErrors."
Illustration 15: The Two New Tables in the Database
The Errors table reveals that the issues lie with the last field in the dump file. A review of a few dump file rows, selected from those identified in the Errors table as having issues, reveals that the primary problem is the "nonparseability" of the right-most field in the file, Details. In some cases this is because the field is empty, with the quotation marks appearing with nothing between them; more commonly, the often lengthy Details field contains characters beyond its final "closing" quotation mark, characters that MS Access therefore cannot parse correctly.
We might work around this by writing a custom macro or script to handle this facet of the data in subsequent imports, perhaps even in a multi-step approach. Alternatively, we may decide that we do not need the characters that are dropped in our current import process; if that is the case, the results we have already obtained will be sufficient to meet our needs.
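If we do choose the scripted route, one simple possibility is a pre-cleaning pass over the dump file before the import, forcing everything after the well-formed leading fields into a single, properly quoted Details value. The Python sketch below is only a rough illustration built on my own assumptions: the number of fields ahead of Details and the file names would need to be adjusted to the actual Elogdmp layout.

import csv

# A rough pre-cleaning pass over the dump file, run before the Access import.
# LEADING_FIELDS (the number of fields ahead of Details) and the file names
# are assumptions -- adjust them to match the actual Elogdmp layout.
LEADING_FIELDS = 8

def clean_row(line):
    """Force everything after the leading fields into a single Details value."""
    line = line.rstrip("\r\n")
    # Split on the separators between quoted fields only LEADING_FIELDS times;
    # whatever remains belongs to Details, however badly it is quoted.
    parts = line.split('","', LEADING_FIELDS)
    if len(parts) > LEADING_FIELDS:
        leading, details = parts[:-1], parts[-1]
    else:
        leading, details = parts, ""          # short row: treat Details as empty
    if leading:
        leading[0] = leading[0].lstrip('"')   # opening quote of the first field
    details = details.strip().strip('"')      # stray quotes around Details
    return leading + [details]

with open("022004_app.txt") as src, \
        open("022004_app_clean.txt", "w", newline="") as out:
    writer = csv.writer(out, quoting=csv.QUOTE_ALL)
    for raw in src:
        if raw.strip():
            writer.writerow(clean_row(raw))

Because the csv writer re-quotes every field and doubles any embedded quotation marks, the cleaned file should parse without the errors described above.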
Another consideration might be to leave the Details column out entirely, and thus eliminate the cause of the error, by simply telling the Import Wizard, at the point of completing the Import Specification dialog, that we wish to skip the field in the import. This could be accomplished by placing a checkmark in the "Skip" checkbox of the Import Specification dialog, as shown in Illustration 16.
Illustration 16: Skipping Import of an Unneeded Field in
the Dump File
Whatever our solution for the above, we have at this stage successfully created an MS Access database containing our Event Log data. I have used databases such as this as a data source for creating reports (using Excel, MSSQL Server Reporting Services, Crystal, Cognos Impromptu, MicroStrategy, and many others), as well as for cube models in Microsoft Analysis Services, Cognos PowerPlay, and others. The Event Log data source can serve our analysis needs either standalone, as we leave it here, or as part of a much larger analysis database containing tables that house data from a host of other logs and sources. The concept of importing these logs is the same: get them into a file format that Access can read, and import them.
Automation can turn the import, or combination, of the individual Event Logs into a recurring, scheduled operation. Moreover, the potential uses of the data we access this way extend beyond troubleshooting and review to monitoring system performance in the context of load balancing, isolating areas that might need software or hardware upgrades, and many other procedures.
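By way of illustration only, a script such as the following Python sketch, launched by the Windows Task Scheduler, could re-export a log and re-run the import against the saved specification through Access automation. The pywin32 package, the paths, the database, specification, and table names, and the exact Elogdmp command line are all assumptions on my part, to be adjusted to the local environment and to the usage shown in Part I.

import subprocess
import win32com.client  # requires the pywin32 package

# Illustrative names only -- adjust paths, database, specification, and table
# names, as well as the Elogdmp arguments, to match your own setup.
DUMP_FILE = r"C:\Logs\app_dump.txt"
DATABASE = r"C:\Logs\EventLogs.mdb"
SPEC_NAME = "Event Log Import Specification"  # the specification saved earlier
TABLE = "Event Log"

# 1. Export the Application log with Elogdmp (Windows 2000 Resource Kit).
with open(DUMP_FILE, "w") as dump:
    subprocess.run(["elogdmp", "localhost", "Application"],
                   stdout=dump, check=True)

# 2. Drive Access through COM automation and re-run the saved import.
access = win32com.client.Dispatch("Access.Application")
try:
    access.OpenCurrentDatabase(DATABASE)
    # 0 = acImportDelim: delimited text import using the saved specification.
    access.DoCmd.TransferText(0, SPEC_NAME, TABLE, DUMP_FILE, False)
finally:
    access.CloseCurrentDatabase()
    access.Quit()

Scheduling a step like this for each log of interest would keep the Event Log tables current without further manual effort.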
30.
Select File -->
Exit to leave MS Access as appropriate.
Conclusion ...
In this article, the second of a two-part lesson, we continued the process we began in Part I, pursuing our objective of creating and loading an MS Access database with the data contained in the Windows Event Log. We began with the import of the Application log that we exported in Part I with the Elogdmp utility obtained from the Windows 2000 Resource Kit. We established specifications for handling the data types of the various components in the dump file as we created and populated an MS Access database in a multi-step process. We discussed the use of the error table generated by MS Access as part of the import operation, and touched upon options for avoiding the errors we obtained. Finally, we again discussed potential uses for the new Event Log database, and noted that the steps in our practice example illustrate concepts that can be automated, with local customization, for recurring, scheduled operations.