Updating Addresses and Adding New LandCare Sites for PHS

During sprint two of my capstone with PHS, I worked on finishing the update structure for the LandCare Project shapefiles. Because we were updating both parcels and sites, separate code was required for each subset of the data. To streamline the process, I used metacoding like that described in my first blog post. Two main processes were coded in this sprint: address updates and the addition of entirely new sites. Both are discussed in this blog post.

I have found it best when explaining the parts of this project to think of parcels and sites as Lego bricks and structures, respectively. Consider an individual Lego brick. On its own it does a job and has a function, just as a land parcel does as a single address. But when parcels are added together they become sites that consist of multiple addresses, much as several Lego bricks combine to make a structure. Bricks can still be removed or added, but the overall geometry of the structure will change.

This is comparable to the address addition and removal process. When we add a new parcel (brick), we must also update the sites file because the geometry changes, and the same is true when a parcel is removed. Even when a change involves only one small parcel, the site geometry changes significantly. This requires a dedicated geometry update in SQL (the ST_Union updates shown below) to ensure these changes are not accidentally missed in a shapefile update.

These address updates were broken into two separate metacoding files: address addition and address removal. Address addition used parcel data from the Philadelphia Water Department (PWD) to provide geometries for new parcels being added to sites. Each parcel was first added individually to the parcels file by selecting the geometry and other unique fields from the PWD parcel data. The parcel file was then updated with the new address range, and the remaining parcel information was copied from other parcels in the site of interest. The code for these steps is below.

INSERT INTO PLCmaint_parcels_2017 (ADDRESS, Geometry)
SELECT ADDRESS, Shape
FROM allparcels_517
WHERE ADDRESS = '1501 N 15TH ST';

-- Copy the remaining attribute fields from a neighboring parcel in the same site.
UPDATE PLCmaint_parcels_2017
SET PARCELID = (SELECT PARCELID FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    TENCODE = (SELECT TENCODE FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    OWNER1 = (SELECT OWNER1 FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    OWNER2 = (SELECT OWNER2 FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    BLDG_CODE = (SELECT BLDG_CODE FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    BLDG_DESC = (SELECT BLDG_DESC FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    BRT_ID = (SELECT BRT_ID FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    GROSS_AREA = (SELECT GROSS_AREA FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    Shape_Area = (SELECT Shape_Area FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    PID = (SELECT PID FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    Year = (SELECT Year FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    Season = (SELECT Season FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    VL_Num = (SELECT VL_Num FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    Sq_Ft = (SELECT Sq_Ft FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    Trees_Num = (SELECT Trees_Num FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    TARGETAREA = (SELECT TARGETAREA FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    Grid = (SELECT Grid FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    CCD = (SELECT CCD FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    ZIPCODE = (SELECT ZIPCODE FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    Action = (SELECT Action FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    TARG_GRID = (SELECT TARG_GRID FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    CCD_Person = (SELECT CCD_Person FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST'),
    Program = (SELECT Program FROM PLCmaint_parcels_2017 WHERE ADDRESS = '1503 N 15TH ST')
WHERE ADDRESS = '1501 N 15TH ST';

UPDATE PLCmaint_parcels_2017
SET NumParcels = '13',
    Addr_Range = '1501 - 1507 N 15TH ST'
WHERE VL_Num = 'VL0123EN_CLP';

Once the parcel information was added and correct, the site shapefile geometry was updated accordingly. This used an ST_Union function that wrote the unioned parcel geometries into the sites table in SpatiaLite; note that the subquery is correlated on VL_Num so that each site receives the union of its own parcels. Address ranges and parcel counts were then updated, as shown in the code block below.

-- Rebuild each site's geometry as the union of its member parcels.
UPDATE PLCmaint_SITES_2017
SET Geometry = (
                SELECT ST_Union(p.Geometry)
                FROM PLCmaint_parcels_2017 AS p
                WHERE p.VL_Num = PLCmaint_SITES_2017.VL_Num);

UPDATE PLCmaint_SITES_2017
SET NumParcels = '13',
    Addr_Range = '1501 - 1507 N 15TH ST'
WHERE VL_Num = 'VL0123EN_CLP';

Parcel removal first required copying the parcel data to the New Use table so the change would be documented for the future. Once this was done, the parcel was deleted from the parcel table. The site geometry was then updated following the same process as above, and the address range and parcel count were updated accordingly. The code is shown below for reference.

INSERT INTO NewUse_2017 (PARCELID, TENCODE, ADDRESS, OWNER1, OWNER2, BLDG_CODE, BLDG_DESC, BRT_ID, GROSS_AREA,
    Shape_Area, VL_Num, Sq_Ft, Trees_Num, PID, TARGETAREA, Grid, CCD, Addr_Range, ZIPCODE, Geometry)
SELECT PARCELID, TENCODE, ADDRESS, OWNER1, OWNER2, BLDG_CODE, BLDG_DESC, BRT_ID, GROSS_AREA, Shape_Area, VL_Num,
    Sq_Ft, Trees_Num, PID, TARGETAREA, Grid, CCD, Addr_Range, ZIPCODE, Geometry
FROM PLCmaint_parcels_2017
WHERE ADDRESS = '1501 N 15TH ST';

DELETE FROM PLCmaint_parcels_2017
WHERE ADDRESS = '1501 N 15TH ST';

UPDATE PLCmaint_parcels_2017
SET NumParcels = '12',
    Addr_Range = '1503 - 1507 N 15TH ST'
WHERE VL_Num = 'VL0123EN_CLP';

UPDATE PLCmaint_SITES_2017
SET Geometry = (
                SELECT ST_Union(p.Geometry)
                FROM PLCmaint_parcels_2017 AS p
                WHERE p.VL_Num = PLCmaint_SITES_2017.VL_Num);

UPDATE PLCmaint_SITES_2017
SET NumParcels = '12',
    Addr_Range = '1503 - 1507 N 15TH ST'
WHERE VL_Num = 'VL0123EN_CLP';

Adding entirely new sites was a straightforward process. Four sections of metacoding were used to break the information down and document each step. The data was copied from the provided shapefiles into the parcels table first, then into the sites table.

For the sites table update, the VL numbers (the unique site identifier shared by a site's parcels) were selected with a SELECT DISTINCT statement and copied into part three. This allowed the information from only one parcel per site to be copied into the sites table, representing the data available for that site. Finally, the site geometry was updated using the ST_Union approach described above. The code used is shown below.

INSERT INTO PLCmaint_parcels_2017 (PARCELID, TENCODE, ADDRESS, OWNER1, OWNER2, BLDG_CODE, BLDG_DESC, BRT_ID,
    GROSS_AREA, Shape_Area, Year, Season, VL_Num, NumParcels, Sq_Ft, Trees_Num, TARGETAREA, Addr_Range, Geometry)
SELECT PARCELID, TENCODE, ADDRESS, OWNER1, OWNER2, BLDG_CODE, BLDG_DESC, BRT_ID, GROSS_AREA, Shape_Area, Year,
    Season, VL_Num, NumParcels, Sq_Ft, Trees_Num, TARGETAREA, Addr_Range, Geometry
FROM Fall16_Stabilization_parcels;

SELECT DISTINCT VL_Num
FROM Fall16_Stabilization_parcels;

INSERT INTO PLCmaint_SITES_2017 (VL_Num, ADDRESS, Year, Season, NumParcels, Sq_Ft, Trees_Num, TARGETAREA, Addr_Range, Geometry)
SELECT VL_Num, ADDRESS, Year, Season, NumParcels, Sq_Ft, Trees_Num, TARGETAREA, Addr_Range, Geometry
FROM Fall16_Stabilization_parcels
GROUP BY VL_Num HAVING VL_Num = 'VL0123EN_CLP';

UPDATE PLCmaint_SITES_2017
SET Geometry = (
                SELECT ST_Union(p.Geometry)
                FROM PLCmaint_parcels_2017 AS p
                WHERE p.VL_Num = PLCmaint_SITES_2017.VL_Num);

Any resulting data from these sections was queried to double-check the work; a sketch of such a check is below. My sprint three blog post will cover exporting these tables to shapefiles and will include visualizations.
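
As an illustration, a check along these lines (a sketch; the exact verification queries varied) confirms that the site row now carries the expected parcel count, address range, and geometry:

-- Verification sketch: inspect the updated site's attributes and area.
SELECT VL_Num, NumParcels, Addr_Range, ST_Area(Geometry) AS site_area
FROM PLCmaint_SITES_2017
WHERE VL_Num = 'VL0123EN_CLP';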

Using Metacoding for Philadelphia LandCare Site Manipulation

For my capstone project, I will be working with the Pennsylvania Horticultural Society (PHS) to develop a streamlined data management system for their LandCare Program. The LandCare Program strives to rehabilitate vacant land parcels in the city of Philadelphia, addressing the urban revitalization and land vacancy issues plaguing the city. The image below shows what some of these land areas can look like.

The project has so far consisted of constructing a database in SpatiaLite to work with the project's shapefiles; see the image below for an overview of the GUI. I selected SpatiaLite because it uses the SQLite language and connects seamlessly with ArcMap. This ultimately allows the user to create and manipulate geodatabases using code written in SpatiaLite and immediately access the results in ArcMap.

My first sprint (May 30th to June 8th) consisted primarily of setting up documentation for the project, in addition to the following items: creating a geodatabase, working to remove sites that have new uses (and are no longer PHS LandCare sites), and writing base code for updating tables based on an Excel file.

Documentation was created so that formal directions will be available for anyone working on this project in the future. In addition, I keep detailed notes of what I have done so far and can update that information as the project progresses.

Most of the coding done so far has been metacoded in Excel. This streamlines the overall update process to the point where someone with no SQL experience can use the Excel file and get valid results. In the image below, an example that deletes a row from a table in SpatiaLite, the user must input only a small amount of information, such as the table name and the identifier of the row being deleted. This benefits not only the inexperienced coder in the future, but also anyone who must update many files at once, as only one column needs to be manipulated. The end result is a set of statements that can be copied and pasted into the query window of SpatiaLite.
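
To make this concrete, the metacoding boils down to spreadsheet string concatenation: a formula assembles the user's inputs into a finished SQL statement. A hypothetical sketch, assuming the table name is entered in cell A2 and the row identifier in B2 (the actual spreadsheet layout differs):

="DELETE FROM " & A2 & " WHERE VL_Num = '" & B2 & "';"

With A2 containing PLCmaint_SITES_2017 and B2 containing VL0123EN_CLP, the formula yields DELETE FROM PLCmaint_SITES_2017 WHERE VL_Num = 'VL0123EN_CLP'; ready to paste into SpatiaLite.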

So far, there are metacoding files for updating new use sites and for updating individual columns. When updating new use sites, we must copy information from both parcels (individual lots) and sites (many lots together) to develop one overarching row of data. This process was broken down into four steps: two for updates and two for deleting the land area from the parcel and site tables, since those areas are no longer in the maintenance database. The individual update metacoding was created for quick, single-column updates such as square footage; see the sketch below. Using this process, information can be quickly added to the pre-written code and then executed in SpatiaLite.
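
A generated statement for one of these quick updates might look like the following sketch (the square footage value and site identifier are placeholders):

-- Quick single-column update (sketch): set a new square footage for one site.
UPDATE PLCmaint_SITES_2017
SET Sq_Ft = 4500
WHERE VL_Num = 'VL0123EN_CLP';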

Finally, in sprint two, the next step will be determining the process to update parcel and site data for address changes. Because many sites may lose or gain a parcel over the course of a year of maintenance, it is important that adequate files are maintained on current geometries. This will most likely again be completed in many steps in the form of a metacoded file.

Find the best matched polygons for use in a time series using pgAdmin

Goal:
Find the best matched polygons in a time series using a PostGIS Spatial Database
Background:
For a previous project, a comparison of housing data from the 1930s to current and historic US Census data was required. The geographic portion of the census data was obtained from the National Historical Geographic Information System (NHGIS), which produces downloadable shapefiles with a GISJOIN field allowing for convenient linking of the shapefile to tables of census data also available through NHGIS. The 1930s housing data was in the form of an image of a 1937 Home Owners' Loan Corporation (HOLC) map of Philadelphia, available here. Very broadly, the HOLC graded neighborhoods based on race and class, and these grades were then used to determine the likelihood of housing loan assistance. There is debate over whether the HOLC process normalized race-based home lending practices and then led to further housing issues such as redlining.

Process:
The HOLC map data was pulled into a GIS and a shapefile of the neighborhood grading was created (this process is often listed as 'Heads-up Digitizing' in job postings); here is a handy tutorial for ArcMap and another using QGIS. The shapefile and the HOLC jpeg are available on GitHub.

The HOLC neighborhood grading schema does not coincide with any current (or historic) census arrangement, so the spatial querying capabilities of pgAdmin and the PostgreSQL database engine were used to match the newly created HOLC digital data with the NHGIS Decennial Census geographic data.
First, create a PostGIS Spatial Database and use pgShapeLoader to load the shapefiles (the HOLC and NHGIS spatial data).
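
For reference, the only SQL needed before loading is enabling the extension; a minimal sketch (the table names match those used in the query below):

-- Run once in the target database to enable PostGIS types and functions.
CREATE EXTENSION postgis;
-- pgShapeLoader (shp2pgsql-gui) then loads each shapefile into its own table;
-- here the tables are assumed to be named us2010 and holc1937.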

After much trial and error, the following code worked the best:

CREATE TABLE overlap2010census AS
WITH limited AS (
    SELECT DISTINCT us2010.*
    FROM us2010, holc1937
    WHERE ST_DWithin(us2010.geom, holc1937.geom, 0.25)
    ORDER BY us2010.gid
)
/* the section above limits the query of the 2010 census polygons (entire US)
   to those near the HOLC polygons; the distance input can vary based on the
   projection of the files */

SELECT DISTINCT ON (limited.gid)
    limited.gid AS us2010a_gid,
    holc1937.gid AS holc_gid, holc1937.name, holc1937.grade,
    ST_Area((ST_Transform(limited.geom, 4326)::geography)) AS area_check,
    ST_Area(ST_Intersection((ST_Transform(limited.geom, 4326)::geography),
        (ST_Transform(holc1937.geom, 4326)::geography))) AS intersect_area_2010_meters,
    (ST_Area(ST_Intersection((ST_Transform(limited.geom, 4326)::geography),
        (ST_Transform(holc1937.geom, 4326)::geography))) /
        ST_Area((ST_Transform(limited.geom, 4326)::geography))) * 100 AS percent_overlap,

/* calculates the area of the returned census and HOLC polygons and the percent of overlap */

    limited.*
FROM limited, holc1937
WHERE limited.geom && ST_Expand(holc1937.geom, 1)
    AND ST_Area(ST_Intersection(limited.geom, holc1937.geom)) > 0

/* joins the files where the geometries overlap */

ORDER BY limited.gid, ST_Area(ST_Intersection((ST_Transform(limited.geom, 4326)::geography),
    (ST_Transform(holc1937.geom, 4326)::geography))) DESC;

/* sorts by the largest amount of overlap, and in conjunction with the DISTINCT ON call
   only returns the first instance of the census polygon that overlaps the target polygon */

While this method requires the user to update and repeat the code for each comparison, the tables created can then be exported to your GIS, or QGIS can connect directly to your database if you prefer that option. As the query also created a unique identifier for each year and polygon of the Census data, a crosswalk/key table was built from this data, which allowed the Census data (race, income, housing status, etc.) for multiple years to be attached via the NHGIS GISJOIN field mentioned previously. Based on the multiyear Census data attached via the crosswalk table, comparisons of the HOLC geographic areas could then be made regardless of shifting Census borders. A sketch of such an attachment is below.
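
As an illustration of that last step, a join along these lines (a sketch; nhgis_2010_data and its columns are hypothetical stand-ins for the NHGIS attribute tables) attaches a year of census data to the matched polygons via GISJOIN:

-- Sketch: attach 2010 census attributes to the matched HOLC neighborhoods.
-- gisjoin comes from the us2010 columns carried into overlap2010census via limited.*
SELECT o.holc_gid, o.name, o.grade, o.percent_overlap, d.*
FROM overlap2010census AS o
JOIN nhgis_2010_data AS d ON d.gisjoin = o.gisjoin;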