
Hazus Data Management International Workflow for Newland (ND)



(Screenshot continued: Table Name List dialog listing Domain_eqDesignLevel, Domain_eqFdtnType, Domain_flBldgType, Domain_flDesignLevel, Domain_flFdtnType, Matrix_BldgType, Matrix_DesignLevel, Matrix_FdtnType, UDF_EQ, and UDF_FL.)

• In the Mapping window, select the Load button and navigate to Data_Management\Models\CBD\Hazus_Updates\Tools\UDF_FL.sav.

Note: The OK button will remain disabled until the Latitude and Longitude fields are mapped.

(Screenshot: Field Mapping dialog, with Source fields (click to select) and Target fields (double-click to assign): NAME, ADDRESS, CITY, STATE, ZIPCODE, CONTACT, PHONENUMBER, COMMENT, YEARBUILT, COST, BACKUPPOWER, NUMSTORIES, AREA, CONTENTCOST, SHELTERCAPACITY.)

• Choose OK to finish importing User Defined Facilities into the Study Region inventory.

TASK 4.3.2 REVIEW UDFS IN HAZUS

Review the UDFs in Hazus before proceeding.

• Inventory | User Defined Facilities
• Click on Map to display the UDFs.

(Screenshot: the imported UDF records, showing OccupancyClass (GOV1, RES1, etc.) and BldgType (Concrete, Masonry, etc.) values …)
TASK 3.1 PREPARE IMPROVEMENTS

Note: The workflow to create Improvements will vary from country to country and cannot be standardized. The source databases to be used for Improvements need to be prepared for each Study Region. Skeleton tasks are provided for reference purposes only.

Improvements is a point feature class that represents the structures within CBD to be modeled. Improvements are unique to Newland and represent the merging of the best available sources into a common feature class. The goal of Task 3.1 is to extract as much information as possible for each building record. Attributes will be populated where they exist; attributes will be derived or defaulted where they do not exist.

The Task 3.1 schema tools workflow is customized for each project. The Improvements schema tools workflow must also be customized for each region if the source data structure or content is not consistent within all Newland Regions.

Improvements are used to create Building Inventory. Building Inventory is generic: it is a defined schema that has been designed to work with Hazus and other modeling tools across all projects.

TASK 3.1.1 DOWNLOAD DATA SOURCES

Building Footprints have been posted to the HIPOC FTP site. Download the Newland GDB to the local Data_Sources folder.

• Download Footprints to Data_Sources\Newland\Inventory\NL_Region_Buildings_GDB.mdb

TASK 3.1.2 CLIP FOOTPRINTS TO CBD BOUNDARY

Region1 building foo…
…Concrete, Masonry, or MH. The resulting matrix will be large (33 x 5).

(Screenshot: value-mapping transformer parameters, with Transformer Name huBldgType_Mapper, Source Attribute hzOccCode, New Attribute Name huBldgType, the value mappings, and a default value.)

Populate UDF.UDFID from BI.State + Object ID. The Object ID is formatted to six characters filled with leading 0s.

(Screenshot: StringConcatenator parameters, with Transformer Name UDFIDConcatenator, Destination Attribute UDFID, and the concatenated items.)

Default the remaining UDF fields for which BI values do not exist.

(Screenshot: AttributeCreator parameters, New Attributes: flBldgDamageFnID = 474, flContDamageFnID = 309, flInvDamageFnID = 309, flFloodProtection = 0, Contact = <Emergency Contact>, PhoneNumber, Comment, hzBackupPower = 0, hzShelterCapacity = 0, huBldgSch = Southeast_Coastal.)

Populate the related SQL tables, using the UDF ID as the key field (a hedged SQL illustration follows below):
• ND_Newland_UDF_SR.mdb: hzUserDefinedFlty, flUserDefinedFlty, huUserDefinedFlty
• Hazus21_Regions\ND_Newland_FLHU\UDS.mdb: hzUserDefinedFlty
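Where these related tables are maintained outside of FME, the same keying can be expressed directly in SQL. The sketch below is illustrative only: the table names follow the workflow above, but the column names (UdfID, FloodProtection, and the damage-function ID fields) are assumptions and must be checked against the actual Hazus schema before use.

    -- Hedged sketch: add any missing flood records keyed on the UDF ID, using
    -- the same defaults as the AttributeCreator above (474 / 309 / 309 / 0).
    -- Column names are assumed, not confirmed by the Hazus data dictionary.
    INSERT INTO flUserDefinedFlty (UdfID, FloodProtection,
                                   BldgDamageFnID, ContDamageFnID, InvDamageFnID)
    SELECT hz.UdfID, 0, 474, 309, 309
    FROM hzUserDefinedFlty AS hz
    LEFT JOIN flUserDefinedFlty AS fl ON fl.UdfID = hz.UdfID
    WHERE fl.UdfID IS NULL;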
(Screenshot continued: Cost, YearBuilt, Area, and NumStories values for the imported UDF records.)

Make sure that the UDFs are correctly loaded into the Study Region:

• The UDF count should match the flood-prone BI count (a SQL count-check sketch follows below).
• UDFs should not be outside the County study region boundaries.
• The UDF locations should be the same as the BI locations.
• The attributes populated in Models\CBD\Hazus_Updates\User_Defined_Facilities\ND_CBD_Hazus_Import_UDF.mdb (UDF_FL_100) should be viewable in Hazus under Inventory | User Defined Facilities.
• Foundation Type values should be numbers (4, 5, and 7 are the most common values), not letters (B, C, and S).

TASK 5 HAZUS FLOOD ANALYSIS

Losses are reported from the User Defined Facilities that have been generated from the Building Inventory. Hazus provides several options for performing flood analysis based upon the available…
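A quick way to confirm the load outside of the Hazus map window is to query the Study Region database directly. This is a minimal sketch, assuming the Study Region created in Task 4.2 is named ND_CBD_FL_UDF and is attached to the local HAZUSPLUSSRVR instance; the FoundationType column on flUserDefinedFlty is an assumption based on the field names used elsewhere in this workflow.

    -- Count the imported UDFs; the result should equal the flood-prone BI
    -- record count (BI_FP_100) produced by the clip in Task 3.
    USE ND_CBD_FL_UDF;
    GO
    SELECT COUNT(*) AS UdfCount FROM dbo.hzUserDefinedFlty;
    GO
    -- Foundation types should be numeric codes (4, 5, 7), not letters.
    -- Table and column names here are assumptions; verify before running.
    SELECT FoundationType, COUNT(*) AS Records
    FROM dbo.flUserDefinedFlty
    GROUP BY FoundationType;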
(Screenshot: ArcCatalog tree, Database Connections node: Add OLE DB Connection, Add Spatial Database Connection; Database Servers, GIS Servers, Interoperability Connections, Search Results, Toolboxes, Tracking Connections.)

• Select OLE DB Provider for SQL Server.

(Screenshot: Data Link Properties, Provider tab, listing the installed OLE DB providers; choose Microsoft OLE DB Provider for SQL Server.)

• Set Data Link Properties.

(Screenshot: Data Link Properties, Connection tab; 1. select or enter a server name …)
(Screenshot: SQL Server Management Studio; right-click a Study Region database for the Tasks menu, which includes Detach, Shrink, Back Up, Restore, Generate Scripts, Import Data, and Export Data.)

• Click OK.

(Screenshot: Shrink Database dialog for MN_BigStone_FLR_HH: "The size of a database is reduced by collectively shrinking the database files, releasing unused space. To shrink individual database files, use Shrink Files instead." Database size currently allocated 101.63 MB, available free space 0.11 MB, with the option to reorganize files before releasing unused space.)

SQL USING SQL SCRIPTS

SQL scripts have been written to manage Hazus Study Region MDFs without SQL Server Management Studio. The following scripts compress the Study Region log files. It is a good practice to run these scripts before creating an HPR.

• Copy Shrink_MDF.bat and Shrink_MDF.sql to C:\SQLCommands.
• Edit Shrink_MDF.bat and enter the correct server name under the -S qualifier.

(Screenshot: Shrink_MDF.bat opened in Notepad …)
• e.g. Vehicles, Utility, and Transportation may be included.
• Earthquake hazard models may be integrated.
• Hurricane hazard models may be integrated.
• Depth-damage functions need to be developed for non-US construction types.
• Customize the Hazus Study Region GUI by adding or modifying the records in syHazus.mdf (StateName = Singapore, StateID = SG, StateFIPs = 99). This did not work in HIPOC 2.3; there are too many dependencies on the StateID in the supporting databases.
• High-rise buildings need special attention; NumStories > 9 is not supported in Hazus.
• Multi-use building codes need special attention. Buildings used for RES1 upstairs and COM1 on the ground floor are commonplace and are not currently supported.

If Hazus is to be used as a global tool, certain design obstacles need to be overcome. These items cannot be entertained without support from the Hazus developers:

• Hazus user guides and technical manuals are in English. There are no descriptions of customizing the Hazus databases for international users.
• Hazus is limited to US$ and feet; Euros or metres are not used anywhere.
• The Hazus projection system is GCS NAD83, which is not a global projection system. International implementations will be performed in GCS WGS84.
• There are tricks to installing Hazus on a non-US device. Technical support is a…
…GDB.

• Do NOT map UDFs while attempting to import.
• Do not cancel the Import; this is a one-time shot. Re-starting the import tools will result in duplicate records.
• Only one UDF import session is allowed during a Hazus session. The following error message appears if you attempt to import twice; the Hazus application has locked up, so use Task Manager to exit and re-start Hazus.

(Screenshot: PDTISHELLLib.HzShortCutMenu ShowImportDialog exception, Code 0x80030702 (unknown error); Description: "Unable to open the table"; Method: ImportDataFromSourceIntoTempTab; MSSQL table hzUserDefinedFlty.)

• After the import, close the User Defined Facilities attribute menu to save changes. Re-open the menu and map the results to review the imported records.
• You can only delete UDFs one page at a time. This can be slow if many records need to be deleted. Use SQL Server Management Studio (Appendix 2) to delete UDFs for loads that did not work (a count-checked sketch follows below):

    DELETE FROM dbo.hzUserDefinedFlty
    DELETE FROM dbo.flUserDefinedFlty

  Copy <Study Region>\UDS_Copy.mdb to UDS.mdb to flush out the spatial records.
• If you do not provide the fl values during the import, the key records are created but all the values are <Null>.
• Try to add a UDF record manually if the imports continue to fail.
• There are no required attributes except for Latitude and Longitude if l…
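Where many failed-load records need to be removed, the two DELETE statements above can be wrapped with count checks so it is clear what was removed. A minimal sketch, assuming the Study Region database (for example ND_CBD_FL_UDF) is the current database in SQL Server Management Studio:

    -- Record counts before the cleanup.
    SELECT (SELECT COUNT(*) FROM dbo.hzUserDefinedFlty) AS hzRecords,
           (SELECT COUNT(*) FROM dbo.flUserDefinedFlty) AS flRecords;
    GO
    BEGIN TRANSACTION;
        DELETE FROM dbo.flUserDefinedFlty;   -- flood-specific UDF attributes
        DELETE FROM dbo.hzUserDefinedFlty;   -- base UDF records
    COMMIT TRANSACTION;
    GO
    -- Both counts should now be 0. Restore UDS.mdb from UDS_Copy.mdb to flush
    -- the matching spatial records before re-running the import.
    SELECT (SELECT COUNT(*) FROM dbo.hzUserDefinedFlty) AS hzRecords,
           (SELECT COUNT(*) FROM dbo.flUserDefinedFlty) AS flRecords;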
TASK 1 PREPARE HAZUS DATABASES
1. Refresh Boundaries
2. Add State Boundary for Newland
3. Add County Boundary for CBD
4. Add Census Block and Tract Boundaries for CBD
5. Install Hazus databases for Newland
Outputs: syBoundary.mdb, ND\bndrygbs.mdb, ND\EF.mdb, UTIL.mdb, TRN.mdb, HPLF.mdb

TASK 2 PREPARE CBD DATA SOURCES
1. Download source data from the Newland FTP site
2. Copy the Models\Template folder
3. Prepare template documents
Outputs: Models\CBD

TASK 3 BUILDING INVENTORY
1. Create Improvements from local data sources
2. Create Building Inventory from Improvements
Outputs: ND_CBD_BI_GDB.mdb

TASK 4 UPDATE HAZUS INVENTORY
1. Create a Hazus-ready UDF database from CBD Building Inventory
2. Create a Flood Study Region for CBD
3. Import UDFs into the CBD Study Region
Outputs: ND_CBD_Hazus_Import_UDF.mdb

TASK 5 HAZUS FLOOD ANALYSIS
1. Import Flood Depth Grid
2. Create Flood Scenario
3. Run Flood Analysis
4. Export Results
Outputs: ND_CBD_FL_Analysis_GDB.mdb, ND_CBD_FL_UDF.hpr

TASK 1 PREPARE HAZUS DATABASES

Hazus 2.1 statewide datasets will be updated before the local Study Regions can be made. Demographics, General Building Stock, and Essential Facility databases will be refreshe…
(Screenshot: Windows Location settings dialog: "To help services provide you with local information such as news and weather, select your present location.")

ENVIRONMENT

• Compress the Study Region log files frequently; keep them below 1 GB. SQL compress scripts have been re-written for SQL 2008 / Windows 7.
• Backup Study Regions frequently. The Hazus processes take too long to risk losing the results. Recommended practice:
  - Duplicate the active Study Region daily; use the convention <Study Region Name>_Verxx.
  - Work in the most recent Study Region.
  - Delete Study Regions more than three versions old.
• SQL Server Management Studio is a must:
  - Used to re-establish SQL Server instances (use Restore Database).
  - Used to compress the SQL logs, which grow very large (use Compress Log).
  - Used to manage the UDF imports.
• The Hazus process to copy a Study Region corrupted the master Study Region; not recommended.
• SQL Server Management Studio is a better option for reporting. Currently the workflow is based upon Access ODBC connections to SQL Server, which assumes that SQL Server Management Studio is unavailable.
• Hazus fixit tools are a must:
  - FixSR runs to re-establish lost Scenarios; this is a lifesaver.
  - TBD: FixSRBP does not work in Windows 7; the Study Region names do not appear.

FLOOD MODELING

• The DFIRM boundary contains many records, as many as a few thousand. Run Dissolve to merge the 100-year polygons into as few as possible. Retain all donuts. The fewer…
…Meters, Return Period to 100.

(Screenshot: Depth Grid | HEC-RAS | DEM; select the Riverine depth grid DG00_cbd_2_5m from the project folder; depth grid parameters: return period 100 (optional); OK / Cancel.)

TASK 5.2 CREATE FLOOD SCENARIO

TASK 5.2.1 CREATE SCENARIO

• Go to User Data to add the flood depth grid: C:\HazusRegions_MR4\<Study Region>\Quick\quickdepth
• Set Parameters: Units = Feet, Return Period (optional) = 100.

(Screenshot: User Data dialog; Select depth grids: Riverine; Browse, Remove, Set Parameters; Return period (optional); OK / Cancel.)

• Hazard | Scenario | New. Name the new Scenario based upon the method used to set up the model: <Inventory>_<Period>_<ID>, where
  <Inventory> = GBS (Aggregate General Building Stock) | UDF (User Defined Facilities)
  <Period> = SRP_100 (100-yr single return period) | 28m (2.8 meter surge)
  <ID> = 1 (unique Study Case number, 1-9)

(Screenshot: Create New Scenario dialog; name UDF_28m_1, description "2.8 meter surge model on UDFs for HIPOC".)
(FME translation log residue continued: STATS totals, Embedded Transformers, Improvements geodb_point.)

• Run the script and review the log file to make sure that the CBD_Buildings records were processed correctly.
• Add CBD Improvements to the MXD and review the results.
• If all OK, save the log file to Models\CBD\Reports\Logs\ND_CBD_FME_Improvements_To_BI_<yymmdd>.txt
• Exit the FME workbench and save the changes to the ND_CBD_Improvements_To_BI tool.

(Screenshot: the resulting Building Inventory attribute table, showing NumStories, YearBuilt, BldgType, BldgCond, and occupancy values; 0 out of 2000 selected.)

TASK 3.2.2 BUILDING INVENTORY REPORTS

• Add Data for datasets:
  Data_Management\Models\CBD\Analysis\Bldg_Inventory\BI\IN_CBD_BI_GDB.mdb\BI
  Data_…
(Screenshot: sharing dialog showing Owner and Co-owner assignments; "Enter names or email addresses".)

TASK 2.1.2 DATA BACKUPS

The Tools folder contains scripts that will be used to process the County datasets. A BAT script is provided to make incremental backups to the Q drive.

• Rename from ND_County_Backups.bat to ND_CBD_Backups.bat.
• Open ND_CBD_Backups.bat in Notepad and replace all occurrences of <Region> with the active County name, e.g. CBD.
• Run the BAT file at significant project milestones to back up work to the Q drive. The script also creates directory listings of the current files and folders under Models\CBD\Reports\Logs.

TASK 2.1.3 DATA SOURCES

Data Source folders have been created as a repository for the raw GIS databases provided by Newland and other GIS vendors. The data sources are organized by data provider and date. Data source folders contain the original data; all data processing events occur in the modeling folders.

• Copy the source data from FTP to Hazus_International\Data_Management\Data_Sources\<Vendor_Name>\<yymmdd>

TASK 2.2 PREPARE MODELS

Model folders are provided to house the local inventory, hazard definitions, and analysis results for CBD.

TASK 2.2.1 MODEL FOLDERS

Model folders have been created for the data processing and modeling activities. The models are derived from a standard Template…
• Select the Flood Depth Grid. In the New Scenario window, use the Add to Selection button to drag a box around the desired flood depth grid. Save the Selection and click OK to complete the Scenario set-up.

(Screenshot: the scenario map selection window: "Select map features to be included in the scenario; a single scenario may contain more than one object type." Map layer types: River reaches, Coastal shorelines, FIT analysis areas, User defined depth grids. Map layer selection: Add to selection, Remove from selection, Clear selection, Save selection.)

• Hazard | Riverine | Delineate Floodplain. Set the Analysis type to Single Return Period. The Period to Analyze should state 100. Click OK.

TASK 5.3 RUN FLOOD ANALYSIS

TASK 5.3.1 RUN ANALYSIS

• Analysis | Run.
• Select User Defined Facilities in the Analysis Options window.

(Screenshot: Analysis Options window, with checkboxes for General Building Stock Damage and Loss, Essential Fa…
…testing may be needed.

• All source data must be projected to GCS NAD83 before starting work. This is the only projection system that Hazus and CDMS support.
• One country at a time. The ND qualifier for Newland is used to replace the statewide tables for North Dakota; the Hazus StateFIPs is 38.
• The HIPOC Ver 2.3 inventory is structured for flood projects. Aggregate models do not run; the inventory tables are empty. Therefore Tract and Block boundaries are not needed to aggregate the inventory. One Tract and one Block will be created using the same geometry as the County in order to create the Study Region. Flood models may be run using imported UDFs.
• The HIPOC Ver 2.2 workflow extends to creating the Study Region and running a flood analysis using a pre-defined flood depth grid. Workflows that describe other Hazus flood modeling options are project specific.
• The provided World, State, and County boundaries are low resolution to save space and may need to be updated from higher quality local data sources.

Version 2.3 of the HIPOC has been extended beyond running a UDF flood event in Newland. The functional model is improved:

1. The HIPOC Ver 2.3 workflow describes updating the Demographics, General Building Stock (GBS), and Essential Facility (EF) inventories.
2. Multiple countries may be included for a regional solution.

Future versions of the HIPOC can be expanded on future international projects to include:

• Other databases…
…the polygons, the faster the Hazus processing.

• The ESRI Dissolve routine requires at least 2 GB RAM to dissolve the 1,000 polygons within a DFIRM to a single polygon. The Dissolve routine reports the problem, but the only solution is more memory.
• Flood depth grid names are limited to 13 characters. The folder structure cannot be deep; move FDGs to Temp before importing them into Hazus.

UDF FAQs

The limitations of using aggregate data to model flood losses are known. Increasingly, UDFs are being used to model individual sites. The buildings most at risk are clipped to the flood boundary and imported into Hazus to determine losses. Alternatively, UDFs may be used to model the following feature classes that are not currently supported in Hazus:

• State-owned buildings
• University campus buildings
• Flood-prone buildings (all occupancy classes)

The following questions have been submitted to the Hazus Help Desk to plan for future Hazus releases:

• UDF reporting is weak. Results are not included in the Global Summary Report.
• It is not possible to generate annualized losses from UDFs.
• The UDF damage curves can be customized in Hazus 2.1 SP3, but the process is not documented (BldgDamageFnID, ContDamageFnID, InvDamageFnID).
• The following fields are not used in the data model: Foundation Type (except 4, Basement), Building Type, Design Level, Condition…
…this case, set NumStories to 1, since only the first floor and below is of interest.

• Basement square footage is not included in Building Area. Hazus uses total finished area (sqft), so if NumStories = 3 and Area = 6,000 then 2,000 per floor. Sometimes we determine Area from the building footprint geometry and building height, where Area = building footprint x (height / 10), which assumes 10 ft per floor. Basements are captured according to Foundation Type. There is a way to tell Hazus the size of the basement. Very often we have partial basements (i.e. half crawl, half basement) but still treat these as Basement = Y. If basement areas are provided, use them to set Foundation Type to Basement.
• Unless the user defines specific damage functions, the User Defined analysis uses the GBS damage functions according to Specific Occupancy. From the Flood User Manual: "The Flood Model will use the damage functions from the General Building Stock damage library. The damage functions are associated by the Occupancy code and key fields such as Num of Stories and Foundation. For example, RES1 with 2 stories and a basement would be listed under R12B, and a COM that is mid-rise and has no basement would be listed under C1MN."

UDF IMPORTS

• Before importing the UDFs, copy <Study Region>\UDS.mdb to UDS_Copy.mdb to back up the spatial…
…was documented, but it is not for general consumption; it is specific to Singapore, using NUS-provided data. The workflow presented in HIPOC Version 2.2 is intended for users who are seeking a generic solution. The project is built for a synthetic country named Newland (ND), equivalent to a Hazus State, and a sub-region named Central Business District (CBD), equivalent to a Hazus County.

The goals of HIPOC Version 2.2 are loosely defined as:

1. The model must run in Newland. That means we do not move Newland to the USA, run the model, and then move Newland back while no one is looking.
2. Replace the US data with Newland data without breaking the model. That means we do not move US Census Blocks and Tracts to Newland. Instead, we incorporate new Newland administrative boundaries.
3. No Hazus US counties or states; these tables need to be populated from the administrative boundaries defined in Item 2.
4. The Study Region is empty. The country, region, tracts, and blocks are generated, but the user must fill them from local data sources (inventory and demographics).
5. The focus is flood. Flooding is one of the most common hazards around the world. Coastal flooding is associated with a rise in sea level over and above normal tidal action. The flood model supports the use of local flood boundaries or depth grids and Building Inventory that can be imported to international Study Regions.

HIPOC Version 2.2 is provided for reference purp…
(Screenshot: Access report export menu with Word Merge, Rename, Hide in this Group, Delete, and View Properties options.)

Export the UDF_Losses_GenOcc_100 report as a PDF: Models\ND\Analysis\Flood\Tables\ND_CBD_UDF_Losses_GenOcc_100.pdf

Occupancy      Building Losses   Buildings Damaged
Commercial     87,901            854
Education      4,334             76
Government     15,003            348
Industrial     43,977            211
Religious      38                1
Residential    435,895           7,722
Total          587,148           9,212

BACKUP STUDY REGION

Backup the Study Region into a compressed HPR.

• Open the Hazus GUI.
• Export Region to Models\ND\HPR\ND_CBD_FL_UDF.hpr

Appendix 1 Glossary

The following terms and abbreviations are used throughout the workflow documentation.

CDMS (Abbreviation): Comprehensive Data Management System
D3 (Abbreviation): Data 3.0 Professional Services
EF (Abbreviation): Essential Facilities
GBS (Abbreviation): General Building Stock
HIPOC (Abbreviation): Hazus International Proof of Concept
UDF (Abbreviation): User Defined Facilities
BI (Feature Class): Building Inventory
FI (Feature Class): Facility Inventory
DEM (Raster): Digital Elevation Model, 10m, statewide
Building Inventory (Term): Editing point GDB for Hazus GBS or UDF analysis
Essential Facilities (Term): Hazus Care, Fire, EOC, Police, and School facilities
Facility Inventory (Term): Editing point GDB for Hazus EF or CF analysi…
…CREATE BUILDING INVENTORY FROM IMPROVEMENTS

Tools have been written to convert CBD Improvements into Building Inventory.

• Add the Newland CBD FME BI toolbox to ArcTools from Models\CBD\Tools\ND_CBD_FME_BI.tbx.
• Right-click | Edit the Improvements_To_BI tool to open the FME workbench.

(Screenshot: ArcToolbox tree showing the Newland CBD FME Building Inventory toolbox alongside the standard Editing, Geocoding, Geostatistical Analyst, Linear Referencing, Multidimension, Network Analyst, Server, Spatial Analyst, and Spatial Statistics toolboxes.)

• Set the input Published Parameters Source to Models\CBD\Analysis\Inventory\Improvements\ND_CBD_Improvements_GDB.mdb.
• Set the output Published Parameters Destination to Models\CBD\Analysis\Inventory\Building_Inventory\ND_CBD_BI_GDB.mdb.

(Screenshot: Improvements_To_BI in the FME workbench, with the Navigator showing ND_CBD_Improvements_GDB as the source and ND_CBD_BI_GDB as the destination, the Published Parameters, and the translation log reporting Total Features Read 334 …
…Area.

• If the units for UDF Cost and UDF ContentCost are in $1 x 1,000, then the reported losses will be in $1 x 1,000. If the units for UDF Cost and UDF ContentCost are in $1s, then the reported losses will be in $1s. The Hazus UDF Losses dialog window has "in thou. dollars" in the title bar, but the exported table headers show USD, so you lose either way. The general guidelines are:
  - If BldgCost, ContCost, and BldgArea values are imported in $1 x 1,000, then the losses will be reported in $1 x 1,000.
  - If BldgCost, ContCost, and BldgArea values are imported in $1s, then the losses will be reported in $1s.
  - If it is important that the GBS match the UDFs, then import the UDFs in $1 x 1,000. For detailed analysis of buildings it may be better to leave the units as $1s.
• Values of 0 or <Null> do not work, even though they may exist in Hazus. Make sure that the following fields are populated correctly:
  - YearBuilt: 1970 (default if 0 or <Null>)
  - BldgType: Wood (default if <Null>)
• UDFs higher than 8 stories will not pass Flood analysis; they will not run at all, with or without damage functions. This will need to be fixed in Hazus. It can be patched by setting all stories greater than 8 to 8 (a SQL sketch follows below).
• Sample inventory up to 3.5 m (the highest flood risk) and develop damage curves around the statistical sample. There will be two damage curves per specific occupancy: one low range, one high range. Run the model twice to determine the range of risk. In…
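If the source data cannot be corrected upstream, the patches described above can be applied with a few UPDATE statements. This is a sketch only: it assumes the records are already in the Study Region table dbo.hzUserDefinedFlty (the staging table UDF_FL in ND_CBD_Hazus_Import_UDF.mdb could be patched the same way), and the column names follow the UDF window shown in Task 4.3; verify them against the actual schema.

    -- Cap stories at 8 so the flood analysis will run (Hazus limitation noted above).
    UPDATE dbo.hzUserDefinedFlty SET NumStories = 8 WHERE NumStories > 8;

    -- Apply the recommended defaults for values that block the analysis.
    UPDATE dbo.hzUserDefinedFlty SET YearBuilt = 1970
    WHERE YearBuilt IS NULL OR YearBuilt = 0;

    UPDATE dbo.hzUserDefinedFlty SET BldgType = 'Wood'
    WHERE BldgType IS NULL;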
(Screenshot: Access 2007 External Data ribbon, More | ODBC Database: "Import or link to an ODBC Database such as SQL Server".)

• Link to the data source by creating a linked table. Navigate to the File Data Source created previously.

(Screenshot: Select Data Source dialog, File Data Source tab, showing IN_Boone_FLR_EQL.dsn and the <State>_Template_SQL_Connect_2003.dsn template in the Tools folder: "Select the file data source that describes the driver that you wish to connect to. You can use any file data source that refers to an ODBC driver which is installed on your machine.")

• The password is goHazusplus.

(Screenshot: SQL Server Login dialog, Data Source IN_Boone_FLR_EQL.dsn, Login ID hazuspuser.)

• Select the Study Region SQL tables to be linked. Check the option to Save Password.

(Screenshot: Link Tables dialog listing the Study Region dbo tables, with Select All, Deselect All, and Save password options …
(Screenshot: ArcToolbox showing the ClaseDeOcupacion_to_Improvements tool alongside the Cartography, Conversion, Data Interoperability, Data Management, and Editing toolboxes.)

• Set the input Published Parameters Source to Models\CBD\Analysis\Working\ND_CBD_Working_GDB.mdb.
• Set the output Published Parameters Destination to Models\CBD\Analysis\Inventory\Improvements\ND_CBD_Improvements_GDB.mdb.

(Screenshot: ClaseDeOcupacion_to_Improvements in the FME workbench, with the Navigator showing PR_Carolina_Working_GDB as the source and PR_Carolina_Improvements_GDB as the destination, the Published Parameters, and the Tool Parameters.)

• Run the script and review the log file to make sure that the CBD_Buildings records were processed correctly.
• Add CBD Improvements to the MXD and review the results.
• If all OK, save the log file to Models\CBD\Reports\Logs\ND_CBD_FME_Buildings_To_Improvements_…
…Each template contains the folder structures and tools used to prepare the source data and model data for each County. The Template contains the knowledge base for the project; it is updated on the Q drive as processes are improved.

• Copy the source data template from Q:\ND_HIPOC\Data_Management\Data_Sources\Models\Template to Hazus_International\Data_Management\Data_Sources\Models\CBD.
• Copy the modeling template from Q:\ND_HIPOC\Data_Management\Models\Models\Template to Hazus_International\Data_Management\Models\Models\CBD.

TASK 2.2.2 TEMPLATES

Template documents need to be set up for each County. Rename all templates and change the file properties. Modify the contents to reflect the active model (CBD).

• Rename from ND_County_ to ND_CBD_.
• Update the File Properties on all CBD documents:
  Subject: CBD
  Author: <Enter your name here>
  Comments: 2014 Pilot
  Category: HIPOC Newland
  Company: D3 Newland
• Open each document in Word and replace all occurrences of <Region> with the active County name, e.g. CBD.

TASK 2.2.3 WORKING FOLDERS

Working folders and GDBs are provided as temporary data stores to process intermediate feature classes. Working folders or files are temporary; they may be removed after the Hazus inventory is updated.

Models\CBD\Analysis\Working\ND_CBD_Working_GDB.mdb
Mod…
…Flood\Data\ND_CBD_FL_Analysis_GDB.mdb

• Run the macro named Losses_Reports_Maker (a SQL sketch of the summary it produces follows below).

(Screenshot: the Access window with the UDF_Losses_100 table open, showing OccupancyClass, BldgDmgPct, and BldgLossUSD columns; the All Access Objects pane lists the UDF_Losses_100 and UDF_Losses_100_SHAPE_Index tables, the UDF_Losses_GenOcc_100 selection and report, and the Losses_Reports_Maker macro.)

TASK 5.4.3 …

(Screenshot: the Access object pane with right-click export options for the loss report: Word RTF File, PDF or XPS, …
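For reference, the general-occupancy summary that the macro produces can also be written as a single query. The table and field names (UDF_Losses_100, OccupancyClass, BldgLossUSD) follow the exported results above; grouping on the first three characters of the occupancy code is an assumption about how specific occupancies roll up to general occupancy, not a documented detail of the macro.

    -- Hedged sketch of a general-occupancy loss summary (Access or T-SQL syntax).
    SELECT LEFT(OccupancyClass, 3) AS GenOccupancy,
           SUM(BldgLossUSD)        AS BuildingLosses,
           COUNT(*)                AS BuildingsDamaged
    FROM UDF_Losses_100
    WHERE BldgLossUSD > 0
    GROUP BY LEFT(OccupancyClass, 3)
    ORDER BY LEFT(OccupancyClass, 3);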
Hazus Data Management International Workflow
For Newland (ND)
August 2014
Developed by Data 3.0 Professional Services (data30.com)

Table of Contents

Project Overview .......................................... 3
Design E… ................................................. 4
File Management ........................................... 6
Document Management ....................................... 8
… ......................................................... 9
Task 1 Prepare Hazus Databases ........................... 11
Task 1.1 Prepare Hazus Boundaries ........................ 11
Task 2 Prepare CBD Data Sources .......................... 20
Task 2.1 Prepare Data Sources ............................ 20
Task 2.2 Prepare Models .................................. 21
Task 2.3 Install Hazus Databases ......................... 22
Task 3 Building Inventory ................................ 23
Task 3.1 Prepare Improvements ............................ 24
Task 3.2 Prepare Building Inventory ...................... 27
Task 4 Update Hazus Inventory ............................ 34
Task 4.1 Create User Defined Facilities .................. 34
Task 4.2 Create Study Region ............................. 36
Task 4.3 Import User Defined Facilities .................. 36
Task 5 Hazus Flood Analysis .............................. 40
Task 5.1 Import Flood Depth Grid ......................... 40
Task 5.2 Create Flood Scenario ........................... 41
Task 5.3 Run Flood Analysis .............................. 43
Task 5.4 Export Results .................................. 44
Appendix 1 Glossary ...................................... 47
Appendix 2 Issues and Questions .......................... 48
Appendix 3 Hazus Hints ................................... 49
Appendix 4 SQL Server Hints .............................. 55
Appendix 5 FME Algorithms For Building Inventory ......... 62
Appendix 6 FME Algorithms User Defined Facilities ........ 63
A…
…Management\Models\CBD\Analysis\Flood\Hazus\IN_CBD_FL_Analysis_GDB.mdb\DFirm_100

• ArcTools | Analysis Tools | Overlay | Clip to determine the flood-prone buildings. Save the output feature to Data_Management\Models\CBD\Analysis\Inventory\BI\IN_CBD_BI_GDB.mdb\BI_FP_100.
• Close ArcMap.

Access scripts have been written to report the structures by general occupancy. Reports may be created for all BI within CBD or just the BI within the flood boundary.

• Open the Building Inventory database in Access: Models\CBD\Analysis\Inventory\Building_Inventory\IN_CBD_BI_GDB.mdb.
• Right-click | Run the macro named BI_Reports_Maker (a SQL sketch of the occupancy summary follows below).

(Screenshot: the Access window with the BI_Reports_Maker macro, the BI_By_Occupancy and BI_FP_100_By_Occupancy selections, the hzOccCode matrix, and the "Start BI Reporter ... Completed" message.)

• Export the report named BI_By_Occupancy to CBD\Hazus_Updates\Tables\ND_CBD_BI_By_Occupancy.pdf.

(Screenshot: the Access report export menu: Layout View, Design View, Rename, SharePoint List, Hide in…
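The occupancy report the macro builds amounts to a count of Building Inventory records per Hazus occupancy code. A minimal sketch, assuming the feature class table is named BI and the occupancy field is hzOccCode as shown in the macro window above (for the flood-prone subset, substitute BI_FP_100):

    -- Building Inventory structure counts by Hazus occupancy code (Access SQL).
    SELECT hzOccCode, COUNT(*) AS Buildings
    FROM BI
    GROUP BY hzOccCode
    ORDER BY hzOccCode;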
TASK 1.1.3 RELATE COUNTY BOUNDARIES TO TRACT BOUNDARIES

Tract boundaries are stored in syBoundary. Tract boundaries will be generated from the regional County boundaries, one Tract per County.

• Open Models\ND\MXD_Documents\ND_Hazus_Boundaries.mxd.
• Edit the tool named 2_syBoundary_syTract:
  - Set the Destination Geodatabase to the syBoundary.mdb to be updated.
  - Run the tool.

(Screenshot: 2_syBoundary_syTract in the FME workbench, with the Navigator showing the syBoundary source and destination geodatabases and the Published Parameters; the translation log reports "Total Features Written 36", "Total Features Read", and "Translation was SUCCESSFUL with 1 warning(s), 36 features".)

• The tool…
Steps to create a Study Region and perform flood modeling are documented in the CBD Risk Assessment Workflow.

• Open Hazus.
• Create a Flood Study Region for CBD, Newland:
  Name: ND_CBD_FL_UDF
  Description: HIPOC Flood analysis using updated GBS and UDF

(Screenshot: Create New Region, Study Region Name page: "Each study region needs to be identified with a unique name. Enter below a name which uniquely identifies your region. The name can be up to 50 characters long." Name ND_CBD_FL_UDF; Region description (optional) "Newland HIPOC Flood Model using UDFs"; Back / Next / Cancel.)

TASK 4.3 IMPORT USER DEFINED FACILITIES

TASK 4.3.1 IMPORT UDFS TO HAZUS

Note: Steps to import UDFs into Hazus are documented in more detail in the Hazus Flood User Manual.

• Open Hazus.
• Open the Flood Study Region ND_CBD_FL_UDF.
• Inventory | User Defined Facilities.
• Right-click in the open area of the User Defined Facilities window and select Import.

(Screenshot: User Defined Facilities window; the right-click menu offers Add New Record, Delete Selected Records, Import, Export, Data Dictionary, and Meta Data.)

• Select the CBD UDF database: Data_Management\Models\CBD\Hazus_Updates\User_Defined_Facilities\ND_CBD_Hazus_Import_UDF.mdb.
• Select the table UDF_FL from the Table Name List and click OK.

(Screenshot: Table Name List dialog, beginning with Domain_eqBldgType …
(Screenshot continued: the remaining Study Region dbo tables in the Link Tables dialog.)

SQL USING SQL SERVER MANAGEMENT STUDIO

SQL Server Management Studio is the preferred environment for managing multiple Hazus PCs.

• Open SQL Server Management Studio.
• Connect to a Server (usually the local PC) using SQL Server Authentication.

(Screenshot: Connect to Server dialog, SQL Server 2008: Server type Database Engine, Authentication SQL Server Authentication, Login hazuspuser, Remember password.)

• Right-click Databases and Attach the Study Region databases residing on the Server (usually the local PC).

(Screenshot: Object Explorer connected to IN-POLIS-17\HAZUSPLUSSRVR (SQL Server 9.0.4035); right-click Databases for New Database, Attach, Restore Database, and Restore Files and Filegroups.)

• Right-click the SQL database that needs to be compressed.
• Select Tasks | Shrink | Database. A T-SQL sketch of the same attach and shrink steps follows below.

(Screenshot: Object Explorer showing the server's Databases and System Databases nodes …
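The attach and shrink steps can also be scripted in T-SQL for PCs where clicking through the SSMS dialogs is impractical. This sketch is illustrative only: the database name and file paths are placeholders and must be replaced with the actual MDF/LDF locations of the Study Region being managed.

    -- Attach a detached Study Region (name and paths are placeholders).
    CREATE DATABASE ND_CBD_FL_UDF
    ON (FILENAME = 'C:\HazusRegions_MR4\ND_CBD_FL_UDF\ND_CBD_FL_UDF.mdf'),
       (FILENAME = 'C:\HazusRegions_MR4\ND_CBD_FL_UDF\ND_CBD_FL_UDF_log.ldf')
    FOR ATTACH;
    GO
    -- Compress the attached database (the Tasks | Shrink | Database equivalent).
    DBCC SHRINKDATABASE (ND_CBD_FL_UDF);
    GO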
(Screenshot continued: the C:\HazusData_21\ND folder listing with MSH.mdb, TRN.mdb, and UTIL.mdb, alongside the IN, ND, OH, PR, SC, and TX state folders; 8 items.)

TASK 3 BUILDING INVENTORY

Building Inventory is considered to be the most current and accurate database of the structures to be modeled by Hazus. It is often created by linking the parcel centroids with tax assessor improvement records. For the HIPOC, Building Inventory is generated from building footprints.

The workflow to create Building Inventory starts with the building footprints, which are maintained by Newland. Not all desired fields have been populated (typical), so missing values will be derived or defaulted. Improvements represent the foundation feature class for the creation of Building Inventory. It is created by populating values with the best available data.

(Screenshot: a map of the CBD building footprints and the Central_Facilities attribute table, showing BldgType, Bldg Name, address, and occupancy values for commercial and residential blocks; 0 out of 2000 selected.)

The high-level workflow to create Building Inventory is described below.

Task 3.1 Prepare Improvements
Run the FME script called Central_Facilities_2_Improvements:
• Creates Improvements at the centroid of each building footprint.
• Joins Permits to Building Points to populate blank Hazus fields.
• Calculates Building Area and NumStories.
• Calculates Building Cost from Building Area.
• Converts Cali to Hazus Occupancy Code.
• Projects Improvements to GCS_NAD83.

Task 3.2 Migrate Building Inventory
Run the FME script called Improvements_2_BI:
• Creates Building Inventory from Improvements.
• Converts all values to Hazus domains.
(Screenshot continued: Analysis Options checkboxes for Essential Facilities, Agricultural Products, Vehicles, Direct Social Loss, Indirect Economic Loss, and What If, with Select All and Deselect All buttons.)

TASK 5.4 EXPORT RESULTS

TASK 5.4.1 EXPORT MAPS

UDF analysis results are unavailable in the Hazus Global Summary Report. Instead, UDF losses are exported to a GDB and reported outside of Hazus.

• Hazard | Study Case | Open UDF_28m_1.
• Export the flood boundaries to Models\ND\Analysis\Flood\Data\ND_CBD_FL_Analysis_GDB.mdb\FL_Bndry_100.
• Results | User Defined Facility.
• Select the BldgLossUSD column and click Map.

(Screenshot: User Defined Facilities Loss results for scenario UDF_28m_1, with OccupancyClass, BldgDmgPct, and BldgLossUSD columns.)

• Right-click the UserDefinedFlty feature class and Export to Models\ND\Analysis\Flood\Data\ND_CBD_FL_Analysis_GDB.mdb\UDF_Losses_100.
• Export the Hazus map as a PDF: Models\ND\Analysis\Flood\Maps\ND_CBD_UDF_Losses_100.pdf.
• Exit Hazus.

(Map: UDF building losses (US$) symbolized in ranges up to $10,000 over the flood depth grid.)

TASK 5.4.2 EXPORT TABLES

The UDF loss reports will be created in Access.

• Open Access to Models\ND\Analysis\Flood\…
(Screenshot continued: the batch file calls osql.exe from the Microsoft SQL Server Binn folder with -S <server>\HAZUSPLUSSRVR, -U hazuspuser, the Hazus password, -i C:\SQLCommands\Shrink_MDF.sql, and -o for the Shrink_MDF.rpt output, followed by pause.)

• Edit Shrink_MDF.sql and enter the name of the Study Region MDF database and log file that need to be compressed.

(Screenshot: Shrink_MDF.sql in Notepad, showing USE SC_Beaufort_FLRC_ANN and BACKUP LOG SC_Beaufort_FLRC_ANN WITH TRUNCATE_ONLY; a cleaned-up sketch of the script follows below.)

• Run Shrink_MDF.bat.
• Review the results in Shrink_MDF.rpt.

(Screenshot: Shrink_MDF.rpt in Notepad, showing the DBCC SHRINKFILE(SC_Beaufort_FLRC_ANN_log, 20) output columns DbId, FileId, CurrentSize, MinimumSize, UsedPages, and EstimatedPages, and "DBCC execution completed. If DBCC printed error messages contact your system administrator.")

Appendix 5 FME Algorithms For Building Inventory

The following filters and mapping schemes were applied to create Building Inventory for Newland. The hz and fl fields are specifically built for Hazus. hzBldgArea, hzBldgCost, and hzContCost are in 1,000s. The unit for BldgValue is $. The unit for BldgArea is sqft.

BUILDINGS TO BUILDING INVENTORY

Populate BI Occupancy Code from Improvements Category. Records that don't match will be defaul…
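For convenience, the contents of Shrink_MDF.sql as shown in the screenshots can be reconstructed roughly as follows. The SC_Beaufort_FLRC_ANN name is just the example visible in the screenshots; substitute the Study Region database and log file being compressed. Note that BACKUP LOG ... WITH TRUNCATE_ONLY is valid on SQL Server 2005 but was removed in SQL Server 2008, where switching the database to SIMPLE recovery before shrinking is the usual substitute.

    USE SC_Beaufort_FLRC_ANN;
    GO
    -- Truncate the log (SQL Server 2005 syntax; removed in SQL Server 2008).
    BACKUP LOG SC_Beaufort_FLRC_ANN WITH TRUNCATE_ONLY;
    GO
    -- Shrink the log file to roughly 20 MB, as reported in Shrink_MDF.rpt.
    DBCC SHRINKFILE (SC_Beaufort_FLRC_ANN_log, 20);
    GO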
…d, and all records deleted in Task 1. Boundary records for Newland will be imported. Inventory records for CBD will be imported.

TASK 1.1 PREPARE HAZUS BOUNDARIES

TASK 1.1.1 REPLACE HAZUS BOUNDARIES WITH WORLDWIDE BOUNDARIES

Pre-populated Hazus databases that can be used worldwide are provided in World. International State and County boundaries have been added, but the inventory is blank. Blank Hazus databases are provided in Blank. The desired records from World are exported to Blank. The user must populate the empty Block and Tract records.

• Copy the Blank *.mdb databases to ND.

(Screenshot: the Hazus_Updates\Statewide\ND folder listing with bndrygbs.mdb, EF.mdb, flAG.mdb, flVeh.mdb, HPLF.mdb, MSH.mdb, syBoundary.mdb, TRN.mdb, and UTIL.mdb; 9 items.)

Note: For users wantin…
(FME workbench menu residue: File, Edit, View, Insert, Readers, Transformers, Writers, Tools; Split Complex Edges; Feature Types; syBoundary GEODATABASE_MDB.)

• The tool will copy the selected syState and related syCounty records from World to ND. In the case of the HIPOC, the Newland State and County boundaries will be exported and the following attributes populated:
  - syState.StateFips = 38
  - syState.HUState = 0 (the Hurricane model is not supported in HIPOC)
  - syState.NumCounties = nnn (the record count in syCounty where State = ND)
  - syCounty.CountyFips = StateFips & nnn

(Screenshot: the syCounty attribute table with the North, West, Central, and other Singapore regions; 1 out of 6 selected.)

• Save the FME log file to Hazus_International\Data_Management\Models\ND\Reports\Logs\ND_FME_syBoundary_syCounty_140819.txt.

Note: Removing unwanted countries and regions from syBoundary.mdb is optional. However, the databases will be more efficient and the Study Region creation process streamlined if the Hazus tables are restricted to the Study Area limits.

Note: Multiple countries can be selected if a regional solution is needed.

Note: For users wanting to write their own ETL tools, modify syBoundary.mdb to remove all records outside of the Study Area.

TASK 1.1.3 RELATE COUNTY B…
• If you do not have the required permissions, then your install is toast. There is no uninstall-SP2 option; you cannot back out. Hazus must be uninstalled, re-installed, and then SP1, SP2, etc. applied.
• Run Hazus service packs in sequence, i.e. assume that SP2 requires SP1. Hazus is not consistent about this; sometimes service packs include previous packs, other times not. Do not take the chance of missing a service pack (i.e. SP3 installed but not SP1).
• The CDMS error "Microsoft Jet OLEDB 4.0 provider is not registered" is not related to Hazus SP1 or SP2.
• The solution for most Hazus users is a dedicated Hazus PC. Once set up and working, they are never touched. PCs are upgraded between projects, never during a project.
• If you can't have a dedicated Hazus PC, then you can set up a dual-boot virtual drive: one for Hazus and another for everyone else. Boot to VHD (Virtual Hard Drive) is provided with Microsoft Windows; it allows a PC to be partitioned into multiple configurations. The Hazus configuration is a user-selectable option at start-up. There are added complications on international PCs.
• You may be able to fix Study Regions that crash.

(Screenshot: Windows error dialog, "Shell Application has stopped working", with the options to check online for a solution, close the program, or debug the program; Problem signature…
…egion is always on the local machine. Use REGEDIT to determine the file paths (FEMA | Hazus | GENERAL): uid = hazuspuser, pwd = goHazusplus_01, server_name = in-polis-17\Hazusplussrvr.

• Open Access to a new database: New | Project Using Existing Data.
• Use a standard convention for the Connection Name: <State>_Newland_SR.adp.
• Create:
  1. Server Name: in-polis-17\Hazusplussrvr
  2. Use Specific Server Name and Password: User Name hazuspuser, Password goHazusplus
  3. Allow Password Saving
  4. Select database: connect either to the Study Region MDF
  5. Test Connection
  6. Save Password

SQL DATABASE CONNECTION USING ACCESS 2007

• Open Access to a new database.
• External Data | More | ODBC Database.
• Link to the data source by creating a linked table.
• Select Data Source | File Data Source | New:
  - Create a New Data Source driver: SQL Native Client
  - Name the File Data Source: <State>_Newland_SR
  - SQL Server Authentication: User Name hazuspuser, Password goHazusplus_01
  - Change the default database to the Study Region of interest: <State>_Newland_<Model>.mdf
• Rename the linked Access database saved as My Documents\DatabaseX.accdb to <State>_Newland_SR.accdb.

SQL DATABASE CONNECTION USING ARCCATALOG
38. els CBD Hazus_Updates Working ND CBD Working GDB mdb TASK 2 3 INSTALL HAZUS DATABASES Any changes made to the Hazus statewide databases during the course of this project will be version controlled Details of the changes will be documented Models CBD Reports Logs ND CBD Hazus Updates lt yymm gt doc ND CBD Hazus Updates lt yymm gt xls Changes to the Hazus Study Region databases are not version controlled The most current Hazus databases must be installed on all local PCs where Hazus modeling will be performed TASK 2 3 1 REPLACE DEFAULT DATABASES The Hazus default General Building Stock specific database must be replaced with the updated database provided by Newland before creating the Study Region e Updated Hazus databases for Newland are provided in Hazus_Updates Statewide ND e Copy the updated boundary database from Hazus_Updates Statewide ND syBoundary mdb to C HazusData 21 syBoundary mdb e Copy the remaining inventory databases from Hazus_Updates Statewide ND mdb to C HazusData_21 ND mdb Gwe die OS C HazusData 21 ND Organize Include in library Share with Burn New folder Ei de HazusData_21 Mame 2 Date modified Type 7 d GA TE de 11 Eh bndrygbs mdb 8 20 2014 4 50 DM Microsoft Ac 1 464 KE lal EF mdb 3 19 2014 11 40 AM Microsoft Ac 1 508 KE a flAG mdb 8 19 2014 11 40 AM Microsoft Ac 960 KB lal fiVeh mdb 6 18 2014 3 15 PM Microsoft Ac 240 KE EN HPLF md
39. ercial BK4QUEENS 1 RMASONRY RESDENTIALICOMMERCI 441 EXISTING Commercial BIk3 QUEEN S NRY RESIDENTIAL COMMERCI 441 EXISTING Commercial BIk2 QUEEN S NRY RESIDENTIAL COMMERCI NRY E O a exIsTNG Commercial Bk 1 QUEENS sce RMASO RMASO 0 RMASO E E 0 EXISTING MSCP ik 28 558 EXISTING Residential Bik 115 BUKIT RMASO RESIDENTIAL 2371 071189 se ens zam L IE 28 558 EXISTING Residential Bik 113 BUKIT RESIDENTIAL 1863 078692 N as ess EXISTING Residential BK111 BUKT 0 RMASONRY 28 28 558 EXISTING Residential Bik 109 BUKIT __0 RMASONRY RESIDENTIAL mi E eee SS 0 out of 2000 Selected Central_Facilities The high level workflow to create Building Inventory is described below Task 3 1 Prepare Improvements Run the FME script called Cental Facilities 2 Improvements Creates Improvements at the centroid of each building footprint Joins Permits to Building Points to populate blank Hazus fields Calculates Building Area and NumStories Calculates Building Cost from Building Area Converts Cali to Hazus Occupancy Code Project Improvements to GCS_NAD83 Task 3 2 Migrate Building Inventory August 2014 Page 23 Hazus Data Management International Workflow Version 2 2 Data 3 0 Run the FME script called Improvements 2 Bl Creates Building Inventory from Improvements Converts all values to Hazus domains
…g to create their own World statewide tables, copy the default Hazus statewide tables for a representative state and delete the records from each database. In this case, users will need to add their own State and County boundaries.

TASK 1.1.2 RELATE STATE BOUNDARIES TO COUNTY BOUNDARIES

Not all pre-populated countries (226) and regions (3,203) will be needed. The countries of interest will be identified and the associated interior regions will be related. For the HIPOC, the Study Area will be Singapore; the corresponding State will be Newland (ND).

• Open Models\ND\MXD_Documents\ND_CBD_Hazus_Boundaries.mxd.

(Screenshot: ArcMap with the ND Region Hazus Boundaries map document, showing the Hazus World Boundaries and Newland Boundaries groups: hzCensusBlock, hzTract, hzCounty, syTract, and syCounty layers.)

• Add the ND_CBD_Hazus_Boundaries.tbx toolkit from Hazus_Updates\ND\Tools.
• Edit the tool named 1_syBoundary_syCounty:
  - Modify the WHERE clause to the country or countries to be modelled.
  - Set the Destination Geodatabase to the syBoundary.mdb to be updated.
  - Run the tool.

(Screenshot: 1_syBoundary_syCounty in the FME workbench …
(Screenshot continued: Data Link Properties, Connection tab; 1. select or enter the server name, e.g. <server>\hazusplussrvr; 2. enter information to log on to the server using a specific user name and password (hazuspuser), with blank password or allow saving password; 3. select the database on the server, or attach a database file as a database name; OK, Cancel, Help.)

SQL DATABASE CONNECTION USING FILE DATA SOURCE

• A template file data source is available in ND\Tools, named <State>_Template_SQL_Connect_2007.dsn.
• Rename the template file data source from <State>_Template_SQL_Connect_2007.dsn to <Study_Region_Name>.dsn.
• Open the <Study_Region_Name>.dsn in Notepad.
• Replace the DATABASE and SERVER variables to match the Study Region to be linked. Save the changes to the File Data Source.

(Screenshot: the .dsn file in Notepad, showing the DRIVER (SQL Native Client), UID (hazuspuser), DATABASE, WSID, APP, SERVER, and Description entries for the IN_Boone_FLR_EQL example.)

• Open Access and link to the Study Region SQL tables via External Data | More | ODBC Database.

(Screenshot: the Access 2007 External Data ribbon …
…is against the Study Region. Tools and workflows were developed to update Hazus databases for the HIPOC. The tools and workflows may be applied to Regions outside of CBD, but they will need to be customized to the local data sources.

Task 4 provides the steps needed to import Building Inventory into Hazus as User Defined Facilities. This inventory is generally used for point-based detailed flood loss analysis.

Task 4 Import User Defined Facilities
• Use the FME script called BI_To_UDF to create User Defined Facilities.
• Create a Hazus Flood Study Region.
• Import UDFs into the Hazus Study Region and test the results.

TASK 4.1 CREATE USER DEFINED FACILITIES

The Building Inventory will be imported into the Hazus Study Region as User Defined Facilities for point analysis of detailed geographic areas. Each UDF point represents a BI point at risk to earthquake, flood, or wind losses. User Defined Facilities are not supported in CDMS; User Defined Facilities will be imported into the Study Region using Hazus. The UDFs must be re-imported to each Study Region to analyze losses to individual structures, typically flood models or other small, detailed geographies.

Buildings with potential losses from flood hazards will be translated to User Defined Facilities. BI is a feature class from which Access UDF tables formatted for Hazus flood modeling will be created. Only UDFs within the flood boundary will be imported (Task 3).

Note: The steps to impor…
(FME workbench residue: Published Parameters showing the Destination Microsoft Access Database File ...\User_Defined_Facilities\ND_CBD_Hazus_Import_UDF.mdb; the translation log reports the UDF_EQ, UDF_Exposure, and UDF_FL tables written and "Translation was SUCCESSFUL with 3 warning(s), 13,901 features".)

• Run the script and review the log file to make sure all records were processed. All BI_FP_100 records should be migrated as UDFs.
• Save the log file to Models\CBD\Reports\Logs\ND_CBD_BI_To_UDF_FL_<yymmdd>.txt.
• Save the changes to the FME script and exit ArcGIS.

The populated Access UDF tables can now be imported into Hazus. The algorithms used in the BI_2_UDF FME script are provided in Appendix 6.

TASK 4.2 CREATE STUDY REGION

TASK 4.2.1 CREATE A STUDY REGION

A Study Region must exist before User Defined Facilities can be imported.
44. ...always 12 hours away.
• Sensitivity/validation studies are needed to instill confidence in the modeling results.
• Allow the user to modify/add State names in syHazus.mdf, e.g., StateName = Singapore, StateID = SG, and StateFIPs = 99.
• Provide the ability for user-defined Study Region boundaries.

FILE MANAGEMENT

BACKUPS

Hazus does not support server-based workflows; therefore, the HIPOC project is based on work performed on a local drive. User logins are sufficient - Hazus no longer requires administrator passwords. Work performed on local PCs will need to be periodically secured (a scripted backup sketch follows at the end of this section). References to the Q drive in this workflow refer to the backup server used at D3.

C:\Projects\Hazus_Projects\Hazus_International    Local project drive
Q:\Hazus_International                            Backup drive

PROJECT MANAGEMENT

Project documentation is stored under the following directory structure:

C:\Projects\Hazus_Projects\Hazus_International\Project_Management
• Status - Project management progress reports
• Advisory - Reference materials for HIPOC implementations
• Workshops - Meetings and workshop materials

DATA MANAGEMENT

Data sets are managed under the following directory structure:

C:\Projects\Hazus_Projects\Hazus_International\Data_Management
• Data_Sources - Pre-processed data
• Hazus_Updates - Updated statewide tables
• Models - Analysis data and results

DATA SOURCES

Data sources received from various agencies are organized by geography.
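The periodic backup from the local project drive to the Q drive can be scripted. A minimal sketch assuming a Windows environment with robocopy available; the paths are those listed above, and the mirroring behavior is a design choice, not a mandated step.

# Minimal sketch: mirror the local project drive to the Q: backup drive.
# /MIR makes the backup an exact mirror, so deletions on C: are also removed
# from Q: - use with care.
import subprocess

LOCAL = r"C:\Projects\Hazus_Projects\Hazus_International"
BACKUP = r"Q:\Hazus_International"

result = subprocess.run(
    ["robocopy", LOCAL, BACKUP, "/MIR", "/R:2", "/W:5", "/LOG:backup.log"],
    check=False,  # robocopy exit codes below 8 indicate success or partial success
)
if result.returncode >= 8:
    raise RuntimeError("Backup failed, robocopy exit code %d" % result.returncode)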
45. ...loading from a table.
• Attempts to import directly from an ArcGIS personal GDB fail with the following error: [Screenshot: PDTISHELLLib HzShortCutMenu ShowImportDialog exception - IDispatch error, "Error in method CreateTableInTempData, MSSQL table hzUserDefinedFlty".]
• Spatial accuracy improves when loading from a table with Lat/Lon rather than loading from a GDB. However, the spatial locations of records imported from a table do not exactly line up with the original GDB points; the UDF point locations are within 5 meters of the source (a distance-check sketch follows below). [Screenshot: map layers comparing the Clay_StateBldgs source points with the imported UserDefinedFlty points.]
• Validation feedback is poor: there is no reporting of the data elements that cause the imports to fail. [Screenshot: the same ShowImportDialog exception with no further detail.]

Appendix 4 SQL Server Hints

The Study Region graphic tables are stored in an Access geodatabase and linked to attribute SQL Server tables. The geodatabase can be maintained in ArcGIS, and the SQL Server tables can be maintained using Access.

SQL DATABASE CONNECTION USING ACCESS 2003

The location of the Hazus Study R...
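The 5-meter offset noted above can be quantified by comparing the imported UDF coordinates against the source GDB points. A minimal sketch assuming both point sets have been exported to dictionaries of lat/lon pairs keyed by UDF ID; how they are loaded (ArcGIS export, Access query, etc.) is left open, and the sample values are placeholders.

# Minimal sketch: flag imported UDF points that drift more than 5 m from their
# source GDB points, using a haversine distance on lat/lon pairs.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

source = {"000001": (1.29500, 103.85000)}     # placeholder: points from the source GDB
imported = {"000001": (1.29503, 103.85004)}   # placeholder: points imported by Hazus

for udf_id, (lat, lon) in imported.items():
    s_lat, s_lon = source[udf_id]
    d = haversine_m(lat, lon, s_lat, s_lon)
    if d > 5.0:
        print("UDF %s is %.1f m from its source point" % (udf_id, d))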
46. [Screenshot: FME Workbench for the 3 bndryGBS_hzBlock tool - readers syBoundary and bndrygbs geodatabases; the translation log lists hzExposureOccupB, hzSqFootageOccupB, and hzTract and ends with "Translation was SUCCESSFUL with 0 warning(s)" (258 features written).]

• Save the FME log file to Hazus_International\Data_Management\Models\ND\Reports\Logs\ND_FME_bndryGBS_hzBlock_140819.txt

[Table: sample hzBlock records showing block IDs, coordinates, and areas.]

• Edit the tool named 4 bndryGBS_hzTract...
47. ...on a non-US English version of Windows. Short of re-installing Windows with the US English code page, there are no easy solutions. Errors may come from SQL Server, which insists on consistency between the collation sequence used to create the databases (US English) and the collation sequence of the target OS.

PIO: If we pass the correct collation type (see below) during Hazus install time, when SQL Express is being installed, we may end up with an international version of Hazus. Alternatively, we can use a separate SQL Server install after Hazus has been installed.

[Screenshot: SQL Server Connection Properties dialog showing the server's Collation property; a collation-check sketch follows below.]

• Changing Regional and Language Options (highlighted in yellow) may save having to re-install Windows and Hazus. It is not ideal to force non-US computers to use the US keyboard, character set, etc., but it may work.

[Screenshot: Windows Regional and Language Options dialog - Standards and formats samples (Number 123,456,789.00; Currency; Time; Short date; Long date).]
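One way to confirm whether a collation mismatch is in play is to query the server and database collations directly. A minimal sketch using pyodbc against the local Hazus SQL Express instance; the instance name and login mirror those used elsewhere in this workflow and should be treated as placeholders.

# Minimal sketch: report the SQL Server collation and the collation of the syHazus
# database so a mismatch with a non-US English OS can be spotted.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Native Client};SERVER=.\\HAZUSPLUSSRVR;UID=hazuspuser;PWD=;"
)
cur = conn.cursor()

server_collation = cur.execute("SELECT SERVERPROPERTY('Collation')").fetchone()[0]
db_collation = cur.execute(
    "SELECT DATABASEPROPERTYEX('syHazus', 'Collation')"
).fetchone()[0]

print("Server collation:  %s" % server_collation)
print("syHazus collation: %s" % db_collation)
if server_collation != db_collation:
    print("Collation mismatch - expect errors when Hazus creates Study Regions")
conn.close()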
48. ...purposes; use with caution. D3 is not responsible for the content of this workflow, the models, or the final loss estimates; there may be errors or omissions.

DESIGN CONSIDERATIONS

Hazus is intended for US consumption. It comes pre-packaged with US datasets and can run out of the box. Therefore, limits to this design will be encountered when adapting Hazus models to international projects:

1. The Hazus program cannot be altered; we do not have access to the core code. International customization is made to the supporting databases that Hazus uses as inputs to the model.
2. HIPOC Version 2.2 assumes that the user has a good understanding of the Hazus and CDMS data requirements.
3. Other data management tools are needed to create Hazus-compliant data from local sources. The HIPOC ETL tool of choice is FME (Safe Software).
4. A solid Hazus installation is required. To confirm, create a Study Region in Boone County, Indiana without error conditions (see Appendix 3, Hazus Hints).

Version 2.2 of the HIPOC is focused on creating a Study Region anywhere in the world. The Study Region is empty, but the structure will support inventory updates using CDMS. There are known design limitations:

1. ArcGIS Ver 10.0 FME tools no longer support ESRI Ver 8.1 geodatabases. The Hazus GDBs have been upgraded to 10.0. Hazus seems to support 10.0 GDBs, but further...
49. • Filter out non-structures (e.g., swimming pools, patios, sheds, parking lots, etc.)
• Convert Area to square feet and Cost to US dollars
• Calculate NumStories = Height × 3.28 / 10
• Calculate BldgArea = Area × NumStories
• Calculate FirstFloorHt = Elev × 3.28
• Normalize Occupancy Codes based upon FType
• Refine RES occupancies from numbers of occupants
• Calculate YearBuilt = AssessmentYear - Age
• Derive missing YearBuilt values from the nearest neighbor

[Table: sample Improvements records showing the FType, Address, NoDeck, Code, NumBasement, Height, and Elev fields.]

TASK 3 2 MIGRATE BUILDING INVENTORY

The Improvements generated from CBD_Buildings will be used to create Building Inventory. Building Inventory becomes the foundational feature class for updating the Hazus General Building Stock (aggregated data) and Hazus User Defined Facilities (individual points). Where Improvements are unique to Newland in content and structure, Building Inventory is generic, with consistent content and structure between projects.

Field mappings and data loading algorithms built into the Improvements tools are provided in Appendix 3; a sketch of the derivation rules listed above follows below.

TASK 3 2 1 CRE...
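The derivation rules listed above can be expressed compactly for spot-checking. A minimal sketch under stated assumptions: Height and Elev in meters, Area (footprint) in square meters, 3.28 converting meters to feet, and roughly 10 ft per story; the FME Improvements tools remain the authoritative implementation.

# Minimal sketch of the Improvements attribute derivations listed above.
SQM_TO_SQFT = 10.7639
M_TO_FT = 3.28

def derive_improvement_fields(height_m, elev_m, footprint_sqm, assessment_year, age):
    area_sqft = footprint_sqm * SQM_TO_SQFT                # Convert Area to sq feet
    num_stories = max(1, round(height_m * M_TO_FT / 10))   # NumStories = Height * 3.28 / 10 (min 1 is an assumption)
    bldg_area = area_sqft * num_stories                    # BldgArea = Area * NumStories
    first_floor_ht = elev_m * M_TO_FT                      # FirstFloorHt = Elev * 3.28
    year_built = assessment_year - age                     # YearBuilt = AssessmentYear - Age
    return {
        "Area": area_sqft,
        "NumStories": num_stories,
        "BldgArea": bldg_area,
        "FirstFloorHt": first_floor_ht,
        "YearBuilt": year_built,
    }

# Example: a 9 m tall building on a 2.1 m pad with a 350 sq m footprint,
# assessed in 2012 at 25 years old.
print(derive_improvement_fields(9.0, 2.1, 350.0, 2012, 25))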
50. Data sources received from various agencies are organized by geography:

Data_Management\Hazus_International\Data_Sources
• ND - Newland national data

HAZUS UPDATES

Updated Hazus inventory is organized by country and inventory type. The Hazus_Updates folder is where the Hazus inventory is replaced from the data in Data_Sources:

Data_Management\Hazus_International\Hazus_Updates\ND
• User_Defined_Facilities - UDF MDBs for import into Hazus
• Tools - FME scripts to create the CDMS import MDBs
• Templates - Empty MDB schemas
• Statewide - ND Hazus tables
• Working - Temporary area for work in progress

MODELS

The Models folder contains the results of the analysis as well as the hazard and inventory datasets used as inputs. Modeling is performed in a Hazus Study Region built by country, state, or county.

Data_Management\Models\ND - Newland model results
• Template - Templates and tools used for the next HIPOC

Data_Management\Models\ND\Analysis - Updates to hazard and inventory databases
• HPR - Exported Hazus Study Regions
• MXD_Documents - Production and final mapping documents
• Reports - Documents and logs

ANALYSIS

Subfolders under Models contain tools, documents, and reports used in the development of the model.

Data_Management\Models\ND\Analysis
• Flood - Flood hazard updates and loss results
• Analysis\Inventory - Building Inventory
• Analysis\Tools - Data processing tools
• Report...
51. ...replacement cost.

REPORTING

• Hazus UDF reporting options are weak. HIPOC Ver 2.3 will explore GBS inventory to unleash better reporting tools (debris, shelter, business interruption losses, etc.).

HAZUS

• Compress the Study Regions; the SQL log files are huge.
• HIPOC Ver 2.3 will explore the idea of syHazus.mdb changes to use the existing country abbreviations (e.g., SG for Singapore) rather than existing state abbreviations (e.g., ND for North Dakota).

Appendix 3 Hazus Hints

INSTALLATION

Installing Hazus software in the US is a tricky proposition; preparing a working Hazus environment on a non-US machine will take some effort. Here are some generic things to be mindful of:

• You MUST have full admin privileges to install Hazus and its service packs. This is the most common problem.
• The PC must be configured for Hazus 2.1. This means the same operating system (no upgrades/service packs) and the same version of ArcGIS (no upgrades/service packs). Hazus SP1, SP2, and SP3 may support newer upgrades, but check the install notes just to be sure.
• Make sure all ArcGIS extensions have Hazus-certified service packs. In particular, the Data Interoperability Extension must be FME 2010 SP2.
• Hazus installation is a one-time shot. In other words, if SP2 does not install properly (say, because someone before you hit Yes and did not hav...
52. [Screenshot: Windows problem report - Event Name APPCRASH, Application Name Hazusp.exe, Application Version 12.0.0.0, Fault Module Name StackHash_d957, Exception Code c0000374, Exception Offset 000cebc3.]

• Review the log files first (DTSLOG, FLDTSLOG, AGGREGATIONLOG) in the Study Region folder. If all are OK, then check syHazus.mdf, table dbo.syStudyRegion, and change the flags as needed (see the query sketch below).

[Screenshot: SQL Server Management Studio - dbo.syStudyRegion opened with Edit Top 200 Rows, showing the HasFlHazard, HasHuHazard, and Valid columns for each Study Region.]

• Do not try to install Hazus on a non...
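The syStudyRegion check can also be done without opening Management Studio. A minimal sketch using pyodbc; the instance name and login are placeholders, and whether you then correct the Valid flag or remove the bad Study Region is a judgment call for the analyst.

# Minimal sketch: dump syHazus.dbo.syStudyRegion so the HasFlHazard, HasHuHazard,
# and Valid flags (the columns shown in the Management Studio screenshot) can be
# reviewed after a failed Study Region build.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Native Client};SERVER=.\\HAZUSPLUSSRVR;"
    "DATABASE=syHazus;UID=hazuspuser;PWD=;"
)
cur = conn.cursor()
cur.execute("SELECT * FROM dbo.syStudyRegion")
print([col[0] for col in cur.description])   # column names, since they vary by release
for row in cur.fetchall():
    print(row)
conn.close()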
53. General Building Stock      Term    Hazus aggregate inventory by Tract or Block
Study Region                Term    Hazus modeling extent (a state)
Study Area                  Term    Hazus area extent (a country)
User Defined Facilities     Term    Hazus point inventory
CBD                         Term    Dummy Study Region, subset of Newland
Newland                     Term    Dummy country, equivalent to a Hazus state

Appendix 2 Issues and Questions

Modeling issues were discovered during HIPOC Ver 2.2. Some have been fixed, but others need consideration before proceeding to the next country.

HAZARDS

• Flood boundaries and/or depth grids may not be available for other counties.

ENVIRONMENT

• Do we need to consider SQL Server Management Studio? Scripts are provided to compress the Study Regions without needing SQL Server.
• Hazus 2.1 SP3, released on 20 Feb 2012, was used to develop the HIPOC Ver 2.2 workflow. The tools may need to be re-run on the current release.

WORKFLOW

• Sometimes we inherit attributes from surrounding BI, e.g., Year Built. This was not done in HIPOC.
• SQL Server Management Studio is a better option for reporting. Currently the workflow is based upon Access ODBC connections to SQL Server, which assumes that SQL Server Management Studio is unavailable.
• The xFactors (sqft) are derived from Newland. We need to determine new xFactors for each country. The Newland xFactors were based upon market value, not...
54. ...Models\ND\Reports\Logs\ND_FME_bndryGBS_hzTract_140819.txt

TASK 1 1 5 UPDATE MAPPING SCHEMES

Each Block has a corresponding record in MSH.mdb to show the distributions of building types and ...

• Open Models\ND\MXD_Documents\ND_Hazus_Boundaries.mxd
• Edit the tool named 5 MSH_SchemeMapping. Set the Destination Geodatabase to the MSH.mdb to be updated. Run the tool.

[Screenshot: FME Workbench for the 5 MSH_flSchemeMapping tool - readers bndrygbs and MSH geodatabase MDBs; the log ends with "Translation was SUCCESSFUL with 0 warning(s)".]

• Save the FME log file to Hazus_International\Data_Management\Models\ND\Reports\Logs\ND_FME_MSH_SchemeMapping_140819.txt

The ND statewide tables are now formatted to run with Hazus. They may be used to create a Study Region, but the...
55. • Reports - Output maps, tables, reports, and logs
• Tools - FME scripts used to create BI GDBs
• Working - Temporary area for work in progress
• Reports\Workflow - Project workflow document

DOCUMENT MANAGEMENT

The workflow document is maintained by D3 for use by both teams working on the Pre-Disaster Mitigation Hazus project for HIPOC. The name of the file is Workflow\ND_CBD_Workflow_v<V>_<R>.docx, where <V> = Version number (1-9) and <R> = Revision number (1-9).

The following abbreviations are used throughout the document:
• TBD - To Be Determined
• PIO - Process Improvement Opportunity
• Name - Contributions required by the named person
• Rev - Major revision marker
• Note - Miscellaneous hints to the reader

Versions are incremented with each project milestone:

Version   Date          Change
2.2       12 Aug 2014   Document started

WORKFLOW OVERVIEW

Generic tasks to update Hazus v2.1 databases to support flood models in Newland are described below. The user will be able to create Study Regions inside their home countries based upon the workflow and tools developed for CBD.

WORKFLOW DIAGRAM

[Diagram: relationship between the Hazus geographical divisions (Census Tract, Census Block) and the Study Region's aggregated-data geographical divisions.]
56. ...source materials. The HIPOC process to perform flood analysis requires a pre-defined flood depth grid (Option 4).

Option   Analysis   Conditions for Use
1        H&H SRP    Hazus generates flood boundaries and flood depth grid from the DEM. Single return periods (usually 100-year) are used.
2        H&H FIS    Hazus generates flood boundaries and flood depth grid from the DEM. Return periods are enhanced using FIS discharge values.
3        EQL        Hazus generates the flood depth grid from the DEM and a provided flood boundary.
4        FDG        Hazus models the losses from a user-defined flood depth grid that has been generated from another modeling application such as HEC-RAS.

Note: Option 4 is documented here. Consult the Hazus Flood User's Manual for Options 1-3 using alternative data sources.

TASK 5 1 IMPORT FLOOD DEPTH GRID

Option 4 is the preferred option when third-party flood depth grids are available. Use Hazard > User Data > Depth Grid to import the FDG, then Hazard > Scenario > New, Hazard > Riverine > Delineate Floodplain, Analysis > Run, and Results > User Defined Facilities.

TASK 5 1 1 PREPARE FLOOD DEPTH GRID

The flood depth grid to be used is specified in the User Data area. A relationship is built between the flood depth grid and the flood scenario.

• Hazard > User Data
• Select the Depth Grid tab
• Use the Browse button and navigate to Data_Management\Data_Sources\Newland\Hazards\cbd_2_8m
• Set the Parameters for Flood Depth grid Units to...
57. ...the steps to import the User Defined Facilities into a Study Region using the Hazus import tools are documented in the Hazus Flood User Manual.

Note: The tools also support earthquake models. To import UDFs into an earthquake model, use the existing workflow and change the hazard type from Flood to Earthquake.

TASK 4 1 1 CREATE UDFS FROM BUILDING INVENTORY

An FME script named BI 2 UDF migrates Building Inventory to UDFs. Building Inventory is re-processed to fit the Hazus database structure and domains. The script is set up for CBD but may be modified for other Regions.

• Open ND_CBD_Hazus_Updates.mxd
• If needed, add the HIPOC Newland FME toolbox to ArcTools from Models\CBD\Tools\ND_CBD_FME_Hazus_Updates.tbx
• Right-click > Edit the BI To UDF tool

[Screenshot: ArcToolbox tree showing the ND CBD Hazus Inventory Updates toolbox with the BI To UDF tool.]

• Set the input Parameters Source to Models\CBD\Analysis\Inventory\Building_Inventory\ND_CBD_BI_GDB.mdb\BI_FP_100
• Set the output Parameters Source to Models\CBD\Hazus_Updates\User_Defined_Facilities\ND_CBD_Hazus_Import_UDF.mdb

[Screenshot: FME Workbench for the BI To UDF tool - reader ND_CBD_BI_GDB, writer ND_CBD_Hazus_Import_UDF.mdb, with the transformer list and published parameters visible.]
58. ...<yymmdd>.txt

[Screenshot: FME Workbench for the Buildings To Improvements tool - reader PR_Carolina_Working_GDB, writer PR_Carolina_Improvements_GDB; the translation log reports 54,862 Improvements features written and "Translation was SUCCESSFUL with 0 warning(s)".]

• Exit the FME workbench and save the changes to the ND CBD Buildings_To_Improvements tool.

The algorithms used in the Buildings 2 Improvements FME script are provided in Appendix 3. In general:
• Filter out unwanted records (Vacant and Abandoned)
• Filter...
59. ...ted to RES1.

[Screenshot: value-mapper transformer named Cat2hzOccCode - Source Attribute: Category; New Attribute Name: hzOccCode; value mappings such as IND to IND2, with a default value for unmatched categories.]

Appendix 6 FME Algorithms User Defined Facilities

The following filters and mapping schemes were applied to create User Defined Facilities from Building Inventory.

BUILDING INVENTORY TO UDF

Populate UDF flDesignLevel from BI YearBuilt:

Year Built     Design Level
< 1950         1
1950 - 1970    2
> 1970         3

[Screenshot: test transformer parameters - Pass Criteria: One Test (OR); test clauses on Left Value / Op / Right Value.]

Populate UDF flFdtnType from BI flFoundationType. The codelist value is needed in SQL (a number), not the description.

[Screenshot: value-mapper transformer named flFdtnType_Mapper - New Attribute Name: flFdtnType; value mappings Slab = 7, Crawl = 5, Basement = 4; Default Value = 7.]

(A spot-check sketch of these two mappings follows below.)

Populate UDF huBldgType from BI Occupancy Code. Records that don't match will be defaulted to WSF1. PIO: Modify to include flBldgType matches (Wood, Steel, ...
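The same mappings can be expressed outside FME for spot-checking a few records. A minimal sketch of the design-level and foundation-type rules shown above; the treatment of the boundary years 1950 and 1970 is an assumption, since the table only gives "<", a range, and ">", and the FME transformers remain the authoritative implementation.

# Minimal sketch of the UDF value mappings described above.

def fl_design_level(year_built):
    """Design level from year built: <1950 -> 1, 1950-1970 -> 2, >1970 -> 3."""
    if year_built < 1950:
        return 1
    if year_built <= 1970:   # boundary handling is an assumption
        return 2
    return 3

# Foundation type descriptions mapped to the Hazus codelist values; unmatched
# descriptions fall back to the default of 7 (Slab), as in the FME mapper.
FDTN_TYPE_CODES = {"Slab": 7, "Crawl": 5, "Basement": 4}

def fl_fdtn_type(description):
    return FDTN_TYPE_CODES.get(description, 7)

if __name__ == "__main__":
    for yb in (1932, 1950, 1969, 1985):
        print(yb, "->", fl_design_level(yb))
    for d in ("Slab", "Basement", "Pier"):
        print(d, "->", fl_fdtn_type(d))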
60. [Screenshot: Access Export ribbon - export options (Word RTF File, Text File, XML File, ODBC Database, HTML Document, dBASE File, Paradox File, Lotus 1-2-3 File, Merge it with Microsoft Office Word) with the BI_FP Reporter and BI_FP Summary objects selected.]

• Export the report named BI_FP_100 Occupancy to CBD\Hazus_Updates\Tables\ND_CBD_BI_FP_100_By_Occupancy.pdf

Exposure Inside 1% Chance Flood Boundary - Carolina, Puerto Rico (11 Mar 14)

Occupancy      Building Area   Building Replacement Cost   Building Count   Average Bldg Area   Average Bldg Cost
Commercial         5,223,920         376,153,868                 884              5,909              425,513
Education          1,124,316          80,954,290                  98             11,473              826,064
Government         2,097,575         151,043,512                 501              4,187              301,484
Industrial         5,554,645         399,941,460                 229             24,256            1,746,469
Religious              4,797             345,409                   1              4,797              345,409
Residential       30,941,875       2,830,771,667              11,585              2,671              244,348
Total             44,947,128       3,839,210,206              13,298              3,380              288,706

TASK 4 UPDATE HAZUS INVENTORY

Hazus 2.1 comes bundled with default modeling data. The Hazus default data is segregated into geodatabase tables for each State. The Statewide data is the master from which Hazus Study Regions are extracted. Hazus performs natural disaster analys...
61. ...footprint polygons must be clipped to the CBD boundary and converted to points (centroids). A working folder and GDB have been set up to prepare the source data (an arcpy equivalent of the two geoprocessing steps is sketched at the end of this item).

• Open the MXD named Models\CBD\MXD_Documents\ND_CBD_BI_Updates.mxd
• Use ArcToolbox > Analysis Tools > Extract > Clip to export Region1 Buildings within the CBD boundary to the Working GDB. Send the output to Models\CBD\Analysis\Working\ND_CBD_Working_GDB.mdb\CBD_Footprints
• Use ArcToolbox > Data Management Tools > Features > Feature To Point to export CBD_Footprints to the Working GDB. Send the output to Models\CBD\Analysis\Working\ND_CBD_Working_GDB.mdb\CBD_Buildings

TASK 3 1 3 CONVERT BUILDINGS TO IMPROVEMENTS

CBD_Buildings will be combined into a common Improvements feature class. Missing values will be populated: populated values will be migrated, and unpopulated values will be defaulted or derived from other values. The scripts are set up for CBD but may be modified for other Regions.

• Add the HIPOC CBD FME BI toolbox to ArcTools from Models\CBD\Tools\ND_CBD_FME_BI.tbx
• Right-click > Edit the 2 Buildings 2 Improvements tool to open the FME workbench

[Screenshot: ArcToolbox tree showing the Carolina FME Building Inventory toolbox with the tools 1 Create Municipio Boundaries, 2 Buildings 2 Improvements, ...]
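The two geoprocessing steps above can also be run from Python. A minimal arcpy sketch; the paths follow the Working GDB layout described above, while the source footprint path and the CBD boundary feature class name are assumptions.

# Minimal arcpy sketch of Task 3 1 2: clip the Region1 building footprints to the
# CBD boundary, then convert the clipped footprints to centroid points.
import arcpy

root = r"C:\Projects\Hazus_Projects\Hazus_International\Data_Management"
footprints = root + r"\Data_Sources\Newland\Inventory\NL_Region_Buildings_GDB.mdb\Region1_Buildings"  # assumed name
cbd_boundary = root + r"\Models\CBD\Analysis\Working\ND_CBD_Working_GDB.mdb\CBD_Boundary"              # assumed name
working_gdb = root + r"\Models\CBD\Analysis\Working\ND_CBD_Working_GDB.mdb"

arcpy.env.overwriteOutput = True

# ArcToolbox > Analysis Tools > Extract > Clip
arcpy.Clip_analysis(footprints, cbd_boundary, working_gdb + r"\CBD_Footprints")

# ArcToolbox > Data Management Tools > Features > Feature To Point
# ("INSIDE" keeps each point within its polygon)
arcpy.FeatureToPoint_management(working_gdb + r"\CBD_Footprints",
                                working_gdb + r"\CBD_Buildings", "INSIDE")

print(arcpy.GetCount_management(working_gdb + r"\CBD_Buildings"))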
62. PROJECT OVERVIEW

In recognition of the importance of planning in mitigation activities, the Federal Emergency Management Agency (FEMA) has created Hazus, a powerful geographic information system (GIS) based disaster mitigation tool. This tool enables communities of all sizes to estimate damages and losses from hurricanes, floods, and earthquakes, and to measure the impact of various mitigation practices that might help to reduce those losses. Hazus is designed for use in the USA. Universities and organizations around the world are investigating the potential use of Hazus as a regional solution to model natural disasters for local mitigation projects.

Data 3.0 and the National University of Singapore (NUS) participated in a Hazus International Proof of Concept (HIPOC) in July 2012. The goal was to integrate local data sets into Hazus to estimate losses resulting from sea level rise in Singapore. ArcGIS and FME tools were used to customize the building inventories to a Hazus-compatible format. The building inventories were classified with the help of the Comprehensive Data Management System (CDMS) for flood hazard modelling. A flood depth grid representing a sea level rise of 2.8 m was imported into a Hazus study region created for the Central Region of Singapore. Several workflows were tested, and a HIPOC methodology was developed and demonstrated at NUS. HIPOC Version 2.1...
63. ...will export the regional boundaries to ND syBoundary\syTract. The County boundaries are deaggregated (polygon parts are made into separate Tracts) and the following attributes populated (a field-population sketch follows at the end of this item):

• syTract.Tract - 38001000001 through 3800n00000x
• syTract.CountyFips - 38001 through 3800n
• syTract.Tract6 - 000001 through 00000x
• syTract.TractArea - Shape_Area in square miles (e.g., 4.754)

[Table: sample syTract records showing Tract, CountyFips, coordinate, and area values.]

• Save the FME log file to Hazus_International\Data_Management\Models\ND\Reports\Logs\ND_FME_syBoundary_syTract_140819.txt

TASK 1 1 4 RELATE TRACT TO BLOCK BOUNDARIES

Tract and Block boundaries are stored in bndryGBS.mdb, and they need to be generated before a Study Region can be made. Tract and Block boundaries will be defined by the syTract boundaries previously created in syBoundary, one Block per Tract.

• Open Models\ND\MXD_Documents\ND_Hazus_Boundaries.mxd
• Edit the tool named 3 bndryGBS_hzBlock. Set the Destination Geodatabase to the bndryGBS.mdb to be updated. Run the tool.
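The syTract attribute population can be sketched with an arcpy update cursor. A minimal sketch under stated assumptions: arcpy.da (ArcGIS 10.1+) is available, the tract number simply concatenates CountyFips with a zero-padded per-county sequence (Tract6), and the area is computed geodesically in square miles; the FME tool remains the authoritative implementation, and the feature class path is a placeholder.

# Minimal sketch: populate syTract.Tract, Tract6, and TractArea as described above.
import arcpy

sytract = r"...\syBoundary.mdb\syTract"  # path placeholder

counters = {}  # running Tract6 sequence per county
fields = ["SHAPE@", "CountyFips", "Tract", "Tract6", "TractArea"]
with arcpy.da.UpdateCursor(sytract, fields) as cursor:
    for shape, county_fips, tract, tract6, tract_area in cursor:
        seq = counters.get(county_fips, 0) + 1
        counters[county_fips] = seq
        tract6 = "%06d" % seq                     # 000001 ... 00000x
        tract = "%s%s" % (county_fips, tract6)    # e.g. 38001000001
        tract_area = shape.getArea("GEODESIC", "SQUAREMILES")
        cursor.updateRow([shape, county_fips, tract, tract6, tract_area])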
64. ...but they will be empty. All tables have been populated with default values and must be updated using local data sources. Data population strategies are provided in HIPOC Ver 2.3. The following sections describe how to model a flood event using local Building Inventory.

TASK 2 PREPARE CBD DATA SOURCES

The General Building Stock and Essential Facility databases were flushed out in Task 1. The GBS and EF records will be updated for CBD using local data sources in Task 2. A modeling folder structure will be set up that contains the source materials: mapping templates, tools, source data sets, and final reports.

TASK 2 1 PREPARE DATA SOURCES

TASK 2 1 1 DATA EXCHANGE

Box.net is the preferred HIPOC data portal. Typical data sets to exchange include:
• Inventory sources (Building Footprints)
• Hazard sources (depth grids)

The HIPOC data portal can be accessed at the following URL; it is password protected: https oox com hipoc ghdiwbcdxfdtle kopw8

[Screenshot: the Hazus_PuertoRico folder on Box viewed in Firefox - file list with Upload/New, Folder Options, Sharing link, and collaborator controls.]
65. ...bndryGBS_hzTract. Set the Destination Geodatabase to the bndryGBS.mdb to be updated. Run the tool.

[Screenshot: FME Workbench for the 4 bndryGBS_hzTract tool - readers syBoundary and bndrygbs geodatabases; the translation log lists hzBldgCountOccupT, hzDemographicsT, hzExposureContentOccupT, hzExposureOccupT, hzSqFootageOccupT, and hzTract and ends with "Translation was SUCCESSFUL with 1 warning(s)" (216 features written, FME session duration 4.7 seconds).]

• Save the FME log file to Hazus_International\Data_Management\Models...
