Importing from other applications

ArcGIS software will read .dbf files, comma-delimited text files, Excel files, and Microsoft Access tables. I prefer working with .dbf files. If I have exported data from a statistical package to a .dbf file, I will often use Microsoft Excel to format the data the way I want it. Make sure the top row contains field names acceptable to .dbf files, and make sure your number fields are numbers and your text fields are text. If your data includes x- and y-coordinates, you can add it to your data frame using File, Add Data, Add XY Data. You will need to specify the field names containing the coordinates and the coordinate system the data is in. When using latitudes and longitudes in decimal degrees, I usually make sure I have coordinates to 6 decimal places. If you want to do any analyses with the data, it is a good idea to export it as a spatial dataset (e.g. a shapefile or geodatabase feature class).
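If you are preparing such a table with a script rather than Excel, the same rules apply: short field names in the top row and coordinates written to 6 decimal places (roughly 0.1 m precision at the equator). A minimal Python sketch, in which the field names and sample coordinates are hypothetical:

```python
import csv

# Sample records: (site ID, longitude, latitude) -- hypothetical data.
records = [
    ("S001", -73.985428, 40.748817),
    ("S002", -118.243683, 34.052235),
]

with open("sites.csv", "w", newline="") as f:
    writer = csv.writer(f)
    # Header row: short names with no spaces, safe for .dbf export.
    writer.writerow(["SITE_ID", "LON", "LAT"])
    for site_id, lon, lat in records:
        # Keep six decimal places of precision in the output.
        writer.writerow([site_id, f"{lon:.6f}", f"{lat:.6f}"])
```

After adding a table like this as XY data (with the appropriate geographic coordinate system, e.g. WGS 1984), export the result to a shapefile or geodatabase feature class before doing any analysis.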

If your dataset includes an ID field that corresponds to an ID field in an existing spatial dataset, you can simply join the .dbf file to the attribute table of the dataset. For census data I usually use a text field “STFID” that is the concatenation of the state, county, tract, block group and block codes to join attribute data to census polygons.
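If your source data stores the component codes in separate fields, you can build the concatenated ID yourself before exporting. A sketch of building a block-level ID from zero-padded Census FIPS codes (the function name is mine, and the example tract and block values are hypothetical; standard code widths are state 2, county 3, tract 6, block 4, with the block group being the first digit of the block code):

```python
def make_stfid(state, county, tract, block):
    """Concatenate zero-padded FIPS codes into a block-level ID string.

    Widths: state 2, county 3, tract 6, block 4 (15 characters total).
    Returning a string preserves leading zeros for the join field.
    """
    return f"{int(state):02d}{int(county):03d}{int(tract):06d}{int(block):04d}"

# Example: a block in Cook County, Illinois (tract/block are made up).
print(make_stfid(17, 31, 320100, 2005))  # -> 170313201002005
```

Because the result is text, join it to a text field in the census polygons' attribute table; joining text to a numeric field will silently fail to match IDs with leading zeros.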

Working with comma-delimited text files
ArcMap will allow you to import a comma-delimited text (.csv) file for joining to a spatial dataset, geocoding, or adding as XY data. Unfortunately, it will read any fields containing only numbers as numbers rather than text. This is a problem if you have fields with leading zeros, such as ZIP codes or IDs. The trick is to create or edit the schema.ini file that resides in the same folder as your .csv file. The file begins with the name of the .csv file in square brackets, followed by one line per column, for example:
[mydata.csv]
Col1=Attribute1 Text Width 10
Col2=Attribute2 Double
Col3=Attribute3 Text Width 32

Col1 refers to the first field, Attribute1 will display as the name of the field, and Text Width 10 creates a string field 10 characters wide.
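If you have to do this for many files, a short script can write the schema.ini next to the .csv. This sketch overwrites any existing schema.ini rather than merging sections into it; the filename and column definitions mirror the example above:

```python
import os

def write_schema_ini(csv_path, columns):
    """Write a schema.ini next to csv_path so text fields keep leading zeros.

    columns: list of (name, type_spec) tuples, e.g. ("ZIP", "Text Width 10").
    """
    folder, filename = os.path.split(os.path.abspath(csv_path))
    # Section header names the .csv file the column definitions apply to.
    lines = [f"[{filename}]", "Format=CSVDelimited", "ColNameHeader=True"]
    for i, (name, type_spec) in enumerate(columns, start=1):
        lines.append(f"Col{i}={name} {type_spec}")
    ini_path = os.path.join(folder, "schema.ini")
    with open(ini_path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return ini_path

write_schema_ini("mydata.csv", [
    ("Attribute1", "Text Width 10"),   # e.g. an ID with leading zeros
    ("Attribute2", "Double"),
    ("Attribute3", "Text Width 32"),
])
```

One schema.ini can hold sections for several .csv files in the same folder, so if you already have one, add a new section rather than replacing the file.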
Thanks to Mark Hoyland for posting this on the ESRI Knowledge Base.

Last modified 1/10/2013.