Upcoming Conference, Connected Pasts London 2014

Last month I finished the last major chapter of my PhD. Last year work was slow for a number of personal reasons, which made me very, very glad I had switched to part-time. This situation also meant I presented at and attended few conferences, which, looking back, was a real shame. I really enjoy academic or disciplinary meetings, as they can be so inspiring, even when the subject is only vaguely related to my current work. Hearing what other people are doing, and in particular how they are doing it or expressing it, really gets my brain going.

However, since the spring this year progress on my PhD has been really steady and satisfying, and I’m now working on revisions. To celebrate I’m off to the Connected Pasts Conference 2014 in London. I’ve not been to this conference before, but I was really intrigued by its interdisciplinary approach, which brings together scientists and archaeologists undertaking network and complexity studies on archaeological data. I haven’t a huge amount of experience with network studies, but as you can probably tell from these posts I do love exploring large bodies of archaeological data. Although there’s been no confirmation yet, there’s talk of a workshop introducing techniques for this sort of analysis prior to the conference, which I’m really interested in attending. I really want to get to grips with other ways of looking at patterns and fluctuations in data over space and time, so hopefully this conference will give me an introduction to what’s possible in network analysis.

Given the time, I hope to live tweet the conference as is my usual habit (see my twitter account @RuthFT for tweets on the day, or search twitter for tweets bearing the conference hashtag #tcp2014), and to write up a conference review either for this site, for PIA (the Institute postgraduate journal) or for another journal. It’ll be odd to be back at Imperial, where I first went to University, after so many years, but I’m looking forwards to it. I believe there are still tickets (and very cheap they are too) available on the Eventbrite website if you’re interested.

8th Experimental Archaeology Conference, 2014

Once again things have been quiet here towards the end of 2013, as I’ve been working on supporting colleagues in Oxford who produced the 8th edition of the UK Experimental Archaeology Conference. I think it went very well, with the organisers Christophe Snoeck & Chelsea Budd doing a great job and Merton College providing an excellent venue.

If you’re interested in what the conference is about, check out the programme over at the conference website and archive, which I created back in 2012 and manage today. It’s a thoroughly interdisciplinary meeting with academics, craftspeople and experimenters from all over western Europe. I’ve met some great people over the last few years and I absolutely love attending every year. The atmosphere is very inclusive and open, and if you do any experimental work I’d encourage you to attend. The organisers usually make every effort to fit in as many papers as they can, and there’s traditionally at least one afternoon of demonstrations by experienced craftspeople or experimenters.

Review of the Roman Finds Group Conference, Spring 2013

I’ve lost track of how much time I’ve spent beavering away at the PhD, but on Friday April 19th I escaped the writing desk to visit the British Museum for a one-day conference organised by the Roman Finds Group entitled The Life and times of the inhabitants of Pompeii and Herculaneum. The conference was held in collaboration with the British Museum and included entrance to the temporary exhibition Life and Death in Pompeii and Herculaneum.

Julie Casidy’s image of the famous cast of a dog from the exhibition; the dog died whilst chained up outside a villa probably belonging to Vesonius Primus, a fuller.

The Roman Finds Group (RFG) is a special interest group for professionals and interested others who enjoy learning about, studying and researching Roman finds (that is, small portable objects that are not pottery sherds). Membership costs a very reasonable £8 a year, for which you get the twice yearly newsletter Lucerna, as well as discounts on meetings and conferences. The cost for the Pompeii conference was £30 – or £20 if you were an RFG member – so the membership is well worth it if you enjoy Roman archaeology.

The RFG have recently redesigned their website, and at the conference they were running a twitter feed using the hashtag #rfg2013. I’ve archived all the tweets over at Storify; these were largely factual and there wasn’t really any discussion or debate occurring on the twitter channel, but Nicola Hembrey in particular made a valiant effort to communicate the conference’s content. This was particularly useful as there were no abstracts etc. of the papers available online, so anyone following along via twitter would otherwise have had little idea what the papers were about.

The layout of the conference was a bit of an oddity, in the context of many I’ve been to recently, in that the majority of the papers were at least 30 minutes long rather than the usual 15-20 minutes. In addition there were no opportunities for questions after any of the papers, and we simply moved from one half-hour paper to the next. This was not particularly problematic with papers such as Alex Croom’s Housework in the homes of Pompeii and Herculaneum which didn’t seem to make any particular argument and meandered through a general discussion of some of the evidence for this rather large subject. However for papers like Ray Laurence’s Pompeii: from the city streets to people and houses where the presenter put forward several theses based on his intensive study of how street space was used in Pompeii, the lack of question time seemed a bit of a missed opportunity.

Having not been to an RFG meeting before I’m not sure whether this is a common feature of their meetings, or whether it might have been a response to the large number of attendees at the conference: there must have been more than a hundred people in the lecture theatre. However I was particularly impressed by the wide age range and the good gender balance. Archaeology societies often seem to suffer from having a top-heavy age distribution, and whilst there didn’t seem to be many students, there were certainly lots of young early-career researchers and professionals. Whilst archaeology is often male-dominated at the top and female-dominated at the undergraduate level, the gender distribution of both attendees and presenters at the conference wasn’t noticeably skewed; there were five male speakers and three female speakers.

The approach presenters took to their papers was quite varied. There was probably a 50/50 split between papers which were ‘read’ and those which were given with limited reference to written notes, though the latter were the easier to follow and engage with. The clarity of layout of the papers was a little variable, with some of the presenters wandering a little, and a few presenters didn’t seem to have any particular thesis, but overall the standard was pretty high. Oddly, one of the best papers was the only short paper; Andrew Jones presented for ten minutes on One pot and its story: a newly discovered amphora from a bar on the Via Consolare, Pompeii. Perhaps due to my habit of attending archaeological science conferences I am more used to this form of tight, snappy paper, but I thought his presentation was technically the best. He synthesised evidence from multiple archaeological techniques, and managed not just to explain why this was important for our understanding of the object but also to use that find to hint at wider socio-economic changes in Pompeii, all in ten minutes! My kinda paper, I have to admit.

Despite that, Jones’ paper wasn’t my favourite, as I was definitely won over by Hilary Cool’s Becoming consumers: the inhabitants of a Pompeian insula and their things. Why someone hasn’t given that woman a professorship I don’t know, because she has a brilliant mind and I’ve found her work consistently solid, engaging and sharp. Here she was discussing the rise of ‘thingyness’, that is the clutter of finds that we see in Roman contexts from the 1st century, and unlike many of the other presenters she contextualised her discussion almost effortlessly within theoretical approaches to Roman objects. I suspect most people didn’t even notice she was touching on theory at all! Whilst apparently a work in progress, her paper definitely got my brain cells turning things over, and she made a number of important points. In particular she pointed out that we often fall into the trap of assuming that Roman society was always object-rich… though if we look at pre-1st century AD contexts that isn’t true. In addition she used the example of loom weights to show that even objects that we perceive as highly practical artefacts may be used entirely symbolically, and that we can’t assume the distribution of any object is purely functional within consumer societies such as the Roman.

Of interest to me were the two presenters who gave papers based on their PhD theses: Ria Berg, discussing Did all Pompeian women have mirrors? Investigating gender, toiletries and domestic space in Pompeii, and David Griffiths, with a paper on From dusk ’til dawn: lamps and lighting in Pompeii. Both of the pieces of work presented here were obviously significant and solid pieces of research covering quite large subject areas, but I felt both struggled to find a clear, simple argument that could be presented in thirty minutes. I’m sure I face exactly the same problem with presenting my work, but it underlined how easy it is to make your exciting research seem harder to understand than it actually is. Definitely something to bear in mind for the future.

In addition to the oral papers, the conference fee also included entrance to the Pompeii exhibition, and at the beginning of the conference Paul Roberts, the person who put the exhibition together, gave an easily accessible half-hour paper on it. To be honest, he showed so many images of the exhibition that by the time I visited it there wasn’t much that was new. I’m not a fan of British Museum exhibitions, as I’ve mentioned here before, and this one didn’t change my mind, so I won’t go into much detail beyond saying that, as usual, the exhibition is frustratingly low on information unless you pay extra for the audio guide or app.

Overall the conference was a great success, and I think the RFG really excelled. They certainly drew in a lot of people, not just archaeologists but interested members of the public, and the papers chosen were generally solid and engaging without being too technical or challenging for the broad audience. The choice of relatively long papers was a definite success, allowing the presenters to go into greater detail and tackle slightly broader topics than I’ve seen at other conferences, but I would have liked to see five minute question times available to discuss some of these papers as the majority presented new research rather than syntheses of established knowledge. Thanks to several of the presenters I left the conference with some new ideas running round my head, and look forwards to seeing what the next Roman Finds Group meeting brings.

US and England map data

Just a short post to point to a source of some really interesting map data: MapCruzin.com

If you’re looking for modern US map data, they’ve got some really interesting datasets to download, including toxic waste and census data. In addition, they have a few datasets for England which include waterway data, which really interested me. The depth of coverage of the waterway data is patchy, but there are some areas (including my area of interest!) where there’s really, really in-depth coverage of everything down to streams.

Best of all, it all comes as SHP files!
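If you want a quick sanity check of one of these downloads before firing up a full GIS package, here’s a minimal Python sketch using the geopandas library (just an illustration, not something MapCruzin provide; the filename is a made-up example):

```python
# Minimal sketch: peek at a downloaded shapefile before loading it into a GIS.
# Assumes the geopandas library is installed; the filename is hypothetical.
import geopandas as gpd

waterways = gpd.read_file("england_waterways.shp")   # hypothetical download
print(waterways.crs)                        # which coordinate system is it in?
print(waterways.geom_type.value_counts())   # lines, points or polygons?
print(waterways.head())                     # first few attribute rows
```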

Importing DigiMap NTF files to ArcMap

Following on from my post pointing to free downloadable map data for the UK and Roman period, and yesterday’s post giving walk-throughs on importing three types of file available from Edina Digimap into ArcMap, I’ve got another file type to discuss:

Profile and Panorama Contour vector data (NTF files)

Edina Digimap are in the process of moving all map data downloads to their main ‘Data Downloads’ interface, where you can select multiple types all on the same screen, order them, and receive an email with a link to download them all in a zip file. If you look under the ‘Land and Height Data’ section, you’ll see two options for contour data (those cool looking contour lines on maps): Profile and Panorama. Profile is the really minutely mapped one, Panorama the slightly easier overview – though they are both really densely packed with information.

These files are delivered as NTF files, which are viewable in neither ArcMap 10.1 nor QGIS (the free open-source mapping software). Once again, the solution to this problem for ArcMap is to use the Productivity Suite extension, which I don’t have, so we’re going to be jumping through hoops a little this time. The solution I present here isn’t fool-proof – in fact, it doesn’t actually work all the time, for reasons I haven’t been able to fathom. As I can’t actually open the original NTF files I’m downloading, I don’t know if the problems with this work-around are due to the NTFs (which I know are sometimes missing pieces), to the software I’m using, or to my own actions.

However, this remains the only functional solution to opening NTF files in ArcMap (and QGIS) that I’ve found, so I present it here:

Getting NTF files into ArcMap

First things first: you’re going to need to convert the NTF files into MIF files. I use NTF2MIF, a free piece of software you can download here. It only has one screen and there are limited options. Simply load in all the NTF files (it seems to be able to cope with lots in one go), and select the Output Option: Merge Tiles or Separate Tiles. Both of these work, producing either one massive MIF of all your NTFs together or one MIF for each NTF. I have read that you should select ‘Separate Tiles’, but this method produces poor MIFs for me just as frequently as the ‘Merge Tiles’ method, so it’s up to you. Press Translate and wait for it to finish.

Now, the good news is that if you use MapInfo, MIF files are easy for you, and you can go straight ahead and import them. I can’t seem to do that in ArcMap 10.1, so we have a few more steps to follow.

Download and install QGIS. Yes, this is the free open-source mapping software. It’s actually very handy to have around as well as ArcMap, because sometimes it does things much more simply and intuitively than ArcMap and, importantly for us, it can save MIF files as SHP files! You can use it as a lovely GUI for a number of Python functions that, honestly, I don’t have time to learn how to use. Here I’m using QGIS 1.8.0-Lisboa.

Using the ‘Add vector layer’ button (on the toolbar, featuring a layer with lines on it and a + sign), add the MIF files. Now, NTF2MIF produces some text.mif files as well, which I assume contain the numbers associated with the contour line height markings. QGIS doesn’t seem to want to load these, so I’ve done without them: not ideal, but I don’t see any other solution at this stage. However, you will have the contour lines at least. At this point, you need to right-click on each MIF file in the layers panel, and select ‘Save As’. Select the format you want (ESRI Shapefile), etc. Now, sometimes this works perfectly and sometimes you get error messages about not being able to save points. In the latter case, the contours still seem to save just fine.

If you’ve only got one SHP file, for instance if you merged your NTFs into a single MIF, you’re basically done. If not, you may want to go to Vector -> Data Management Tools -> Merge shapefiles to one. This is a really simple way of merging shapefiles, but I think they all have to be of the same type (line, poly, point etc), and you’ll need to select the right type in the dialogue. If you’ve ended up with fifty MIF files and turned them into fifty SHP files, you may want to merge them into just one SHP file this way.
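If clicking through the dialogue for dozens of files gets tedious, the same merge step can be scripted. Here’s a minimal sketch using the geopandas Python library rather than QGIS (it assumes all the shapefiles are the same geometry type, and the folder and output names are made-up examples):

```python
# Sketch of merging many converted shapefiles into one, outside QGIS.
# Assumes geopandas/pandas are installed and all inputs share a geometry type;
# the file pattern and output name are hypothetical.
import glob
import pandas as pd
import geopandas as gpd

parts = [gpd.read_file(path) for path in glob.glob("converted/*.shp")]
merged = gpd.GeoDataFrame(pd.concat(parts, ignore_index=True), crs=parts[0].crs)
merged.to_file("contours_merged.shp")
```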

 

If anyone has a better workaround for NTF files, please do let me know. In particular, if you know how to get the contour height numbers into QGIS as well, I’d love to hear from you!
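One avenue I haven’t properly tested: some builds of the free GDAL/OGR library include a reader for Ordnance Survey NTF, which might let you skip the MIF step entirely. A hedged sketch using the Python bindings (the filenames are made up, and I can’t promise the driver copes with Profile/Panorama contour tiles):

```python
# Hedged sketch: convert an NTF tile straight to shapefiles using GDAL/OGR's
# UK NTF reader, if your GDAL build includes it. Filenames are hypothetical
# and this is untested against Profile/Panorama contour tiles.
from osgeo import ogr

src = ogr.Open("profile_tile.ntf")   # an NTF tile downloaded from Digimap
if src is None:
    raise RuntimeError("This GDAL build does not seem to read UK NTF files")

shp_driver = ogr.GetDriverByName("ESRI Shapefile")
# CopyDataSource writes one shapefile per layer into the target folder
shp_driver.CopyDataSource(src, "profile_tile_shp")
```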

Learner experiences getting Edina DigiMap data into ArcMap

Following the previous post where I pointed to a few sources of map data, I thought I’d write about importing them into ArcMap 10.1, particularly because it’s been quite labour-intensive: whilst there are plenty of walk-throughs on the respective websites, very few of them have solved all my problems. I should say I am really not a GIS expert. I’m just sharing some methods that worked for me, as a novice, and they are likely to be quick and dirty and potentially not the best methods.

So first up are the many layers of data available from Edina DigiMap. This data is only available through an Institution, and even then they make you sign up individually and wait to get your ‘approval’. And the service logs you out automatically after 30 minutes.

It’s worth saying that when you order your Digimap data, you should keep the text files that come with the data files, preferably all together. These contain important metadata, and they also record the licensing conditions, which you need to know for publishing your maps, etc.

Boundary data vector files (shp)

So first things first, you need an outline of the UK, or your county, or individual region (England, Wales, Scotland). You’ll need to go to the Ordnance Survey tab, click on the ‘Download OS mapping data’, and then select ‘Boundary Download’. Hey presto, ‘GB National Boundaries’ data is available at the bottom.

As I’m using ArcMap, I’m going to select ‘shp’ files where possible, because they are the software’s favourite format and so a damn sight easier to work with as a novice than anything else. Follow through all the screens to process your ‘order’, and download the data.

Getting the data into ArcMap

This one is easy, really. We’ve got shp files so all you need to do in ArcMap is click the Add Data button, which is a little yellow diamond with a black + sign in it. Voila!
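If you ever need to script that step rather than clicking, something along these lines should work from the Python window in ArcMap 10.1 using the arcpy.mapping module; treat it as a sketch, and note the shapefile path is a made-up example:

```python
# Sketch of the 'Add Data' step from the ArcMap 10.1 Python window.
# The shapefile path is a hypothetical example.
import arcpy

mxd = arcpy.mapping.MapDocument("CURRENT")        # the currently open map
df = arcpy.mapping.ListDataFrames(mxd)[0]         # first data frame
layer = arcpy.mapping.Layer(r"C:\GIS\gb_boundaries.shp")
arcpy.mapping.AddLayer(df, layer, "AUTO_ARRANGE")
```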

OS Mastermap Topography raster files (TIFF)

This is the most detailed mapping layer for Britain, at 1:1000. It is almost too detailed and rather colourful, but you might want it as a base layer for mapping trenches etc. on top of – it really is that detailed. It comes as TIFF files, which are image files, along with lots of other files which tell ArcMap how to relate the TIFFs to each other.

Getting the data into ArcMap

If you only have one TIFF file, just add it like a normal piece of data. However, if you’ve got lots of them, I found the method below produced the smoothest results of all the ones I tried.

Open up ArcCatalog in ArcMap. This might be a ‘Catalog’ button on the right-hand side of the ArcMap screen, or there’s a little symbol button for Catalog on the main toolbar. The Catalog appears to be a file-management bolt-on, amongst other things. Navigate to the place you’re using to store your GIS files. Right-click the folder, and go to New -> File Geodatabase and give it a name, etc. Right-click your new geodatabase and make it your default geodatabase.

Within this geodatabase we’re going to create a mosaic dataset. Basically we want all the image files of our map in one layer, lying next to each other in the right positions, so we want a mosaic. Right-click on your new geodatabase and go to New -> Mosaic Dataset. Go through the dialogues and select your options; in particular you want to make sure the Spatial References are right (i.e. if you’re using British National Grid you need to select this). Make sure you name it correctly the first time, as it seems reluctant to change names once created.

To fill this empty mosaic dataset with files, right-click on it, and click Add Rasters. Make sure the Raster Type is set to Raster Dataset, and use the Input Data type, Dataset. In the advanced options you can specify the coordinate system, and when you’ve finished you can click OK and wait for the program to process the command. It will take some time.
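For anyone who would rather script this than click through the dialogues, the same sequence can also be run from the ArcMap Python window with arcpy. This is only a sketch of how I understand the tools to work, and the folder, geodatabase and mosaic names are made-up examples:

```python
# Rough arcpy equivalent of the steps above: create a file geodatabase,
# create an empty mosaic dataset in it, then fill it with the TIFF tiles.
# Paths and names are hypothetical; the spatial reference assumes
# British National Grid.
import arcpy

gis_folder = r"C:\GIS"
arcpy.CreateFileGDB_management(gis_folder, "mastermap.gdb")

gdb = gis_folder + r"\mastermap.gdb"
sr = arcpy.SpatialReference("British National Grid")
arcpy.CreateMosaicDataset_management(gdb, "topo_mosaic", sr)

arcpy.AddRastersToMosaicDataset_management(
    gdb + r"\topo_mosaic",       # the empty mosaic dataset
    "Raster Dataset",            # raster type
    r"C:\GIS\mastermap_tiffs")   # folder of downloaded TIFF tiles
```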

The new mosaic dataset should appear automatically in your map window; you may need to zoom in, or right-click the layer in your list of layers and select ‘zoom to layer’. If it hasn’t been added, you can drag it in from the catalog interface.

OS Mastermap Topography Vector files (GML)

Integrated Transport Network and individual topography selections at 1:2500 are produced as GML files. These are the kind of poly and line shapes you’re used to seeing, and might be the best for mapping detailed stuff on top of.

When you download these you get contents text files, and it is important that these are kept with the original files and folders. The download comes with a PDF file on how to get these into mapping software. If you have the Productivity Suite extension for ArcMap, you can follow the instructions listed there. If, like me, you don’t, then you’re limited. The guide says ArcMap can open GML by just clicking the ‘add data’ button, but that doesn’t work for me at all. So that leaves one option: converting the files using InterOSpe.

Download and install InterOSpe, and start up the Processing program, which needs you to tell it where the contents text file is so that it can read through it. You don’t need to unzip the GML folders, so leave them be. After that it’s pretty straightforward: just click through the numerous dialogues and set the program going. What you should end up with is lots of shp files, which you can add as usual. You also end up with some which are lines and points for placing the gazetteer text; the easiest solution to this I’ve found so far is to right-click the ‘gazetteer’ layer (once you’ve added it), go to Properties, remove the points entirely and use the ‘label’ feature to call the INDEX_NAME field and write that as a label.

There’s a crazy amount of work in turning all the different line types into the right colours, but it can all be done. InterOSpe is supposed to come with the right symbology for OS maps, but the output still seems to need a lot of work.

Historic County Series raster files (TIFF)

There are a lot of historic maps, but as I’m looking at a rural location I’m not using the Town Plans. In the County Series there are 1:2500 and 1:10560 schemes, and you can have the original sheets, the national grid tiles or both. These are great for looking at changes in landscape, doing a map regression, or, in my case, for looking at the landscape of a site before the water board put a reservoir over half of it!

These are delivered as TIFF raster files, so you can follow the method discussed above under OS Mastermap Topography.

That’s it for now… there’ll probably be other posts later when I tackle some of the other file types, and possibly some discussions of the Roman datasets, though they are remarkably easier to import (thank you, digital classicists!).

UK GIS layers and Roman map data


The first map I ever made! Bedrock Geology (DiGMapGB data), with Roman roads and mines (DARMC data). Yes, it does have the wrong projection, but you have to start somewhere…

For the last week or so I have been experimenting with something old and something new: GIS.

In particular, since I have finally given in and bought a laptop capable of handling complex tasks without freezing, I’ve installed ArcMap. This is a piece of Geographical Information Systems (GIS) software produced by ESRI, which is hideously expensive but, along with its main competitor MapInfo, is the industry-standard software for producing anything from simple maps to statistical analyses of complex 3D geographical models. My copy is licensed through UCL, but if you don’t have an institution to pay for such things, I’ve been told that QuantumGIS is both free and a pretty good alternative.

I’ve used GIS software before, at both of my HER/SMR jobs, but that was limited to popping new points on the map and adjusting the position and shape of things other people had mapped. This last week I have been creating maps from scratch for the first time.

I have to admit, I have enjoyed almost every minute of it. Maps are amazing, and being able to create maps specifically for your research is a powerful tool. It’s hard to express how exciting it is to have all your data pop up on the screen, and to be able to overlay that data on underlying patterns such as previously discovered sites, army sites, topography, bedrock geology, roads… it’s just fantastic.

So fantastic, I haven’t managed to produce my ideal maps yet. The major problem has been finding suitable data to import into my map; after all, there’s no point duplicating the great work done by lots of other people to create, for instance, Roman road maps. GIS works with ‘layers’, in the same way that Photoshop etc do, only these layers are geographically referenced. So I’ve been scouring the web for layers to use, and of course I’ve faced lots of challenges with importing different file types (looking at you, Ordnance Survey!), and setting the map up using the right projection (basically the coordinate system – turns out things look a lot nicer if you pretend the world is flat but that needs conversions…).
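To give a flavour of what that projection wrangling looks like outside the ArcMap dialogues, here’s a minimal sketch using the free geopandas Python library (not what I actually used, just an illustration; the filename is made up, and EPSG:27700 is British National Grid):

```python
# Minimal sketch of a projection fix: reproject a layer that arrived in
# latitude/longitude (EPSG:4326) into British National Grid (EPSG:27700)
# so it lines up with Ordnance Survey layers. Filename is hypothetical.
import geopandas as gpd

roads = gpd.read_file("roman_roads.shp")
print(roads.crs)                       # what projection did it arrive in?
roads_bng = roads.to_crs(epsg=27700)   # reproject to British National Grid
roads_bng.to_file("roman_roads_bng.shp")
```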

I could spend a lot of time talking about all the challenges and the things I’ve learnt, but as I’m still in the process I thought I’d just share a number of good sources for map data. The list is entirely biased towards Roman map layers and the UK, but that’s what I’m focussed on right now. Doubtless I’ll have to tackle Austrian maps soon, but in the meantime…

UK maps:

  • DiGMapGB: the Digital Geological Map of Great Britain, this set of data is available from the British Geological Survey, and includes both bedrock and superficial deposits. It is free to download for commercial, research and public use, though you have to acknowledge use.
  • Mineral data: the British Geological Survey also has a lot of mineral data, as well as a patchy map of pits, mines and quarries in the UK. Unfortunately none of this data is free, as far as I can ascertain.
  • Edina Digimap: only good for students/staff at academic institutions, this site will give you (sometimes limited) access to a massive amount of data. Historic maps (i.e., the early UK maps), Geology maps (from the British Geological Survey, but in an easy-to-access way), and Modern maps including boundaries (often in irritating formats that need conversion or cause headaches). They also make you register individually, which is a bit of a pain, but this is definitely the place to go.

Roman maps:

  • Pleiades: If you are feeling extremely brave and technical, Pleiades produce a CSV file (text) every day with the raw dump of their data. This contains a lot of mapped sites, so you could try importing that (there’s a rough sketch of one way to do it after this list). I haven’t! They also produce a KML file every day, but as this seems to require a script (erk!) to convert it to a file I can use in ArcMap, I haven’t tried it (yet). However, I suspect this is the best quality and density of information for Roman sites.
  • Ancient World Mapping Centre: The data here is great, and easily accessible in .shp format (for us ArcMap users!).
  • Digital Atlas of Roman and Medieval Civilisations: this rather retro-looking site is in fact the placeholder for some good quality data. You need to open ArcMap and connect to their map servers to download the data, but the DARMC website walks you through it easily enough.
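As promised above, here’s a rough sketch of what importing the Pleiades CSV dump might look like in Python with geopandas. I haven’t actually run this against a real dump, and the coordinate column names (reprLat/reprLong) and the filename are assumptions you should check against the file you download:

```python
# Hedged sketch: turn the daily Pleiades CSV dump into a point shapefile.
# The filename and the coordinate column names ("reprLat"/"reprLong") are
# assumptions to verify against the file you actually download.
import pandas as pd
import geopandas as gpd
from shapely.geometry import Point

places = pd.read_csv("pleiades-places-latest.csv")
places = places.dropna(subset=["reprLat", "reprLong"])

points = gpd.GeoDataFrame(
    places,
    geometry=[Point(xy) for xy in zip(places["reprLong"], places["reprLat"])],
    crs="EPSG:4326")   # Pleiades coordinates are plain latitude/longitude
points.to_file("pleiades_places.shp")
```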

All the Roman data is free, using a CC licence of one form or another. You do have to hand it to the digital classicists, they do work hard and produce some great data!

Edit: And if you’re interested in other historical map data, including for the US, check out this handy list of map data at Historical GIS and Clearinghouse Forum.

7th Experimental Archaeology Conference

Things have been, and probably will continue to be, quiet here, as I’m using all my blogging brain over at the Experimental Archaeology Conference Website.

With support from Roeland Paardekooper, I’ve written and currently run the Conference’s website, which contains the archive of abstracts from all of the past conferences, as well as a substantial number of the academic posters (in PDF form) which have been presented at the conference.

The aim of the website is to provide a single, permanent place for information on the conference, which is running annually, as well as an archive of information from all past conferences. This is particularly important because material presented at the conference is not always published immediately, and with many contributors coming from museums, craft and public engagement contexts there is often little time for publication. We hope that the website, visible as it is on web searches, will enable other researchers to at least find out when someone is working on similar subjects or material and help them reach fellow researchers.

As you can imagine, developing the content and keeping the website up to date on the approach to the conference takes a lot of time, so you likely won’t see me back here for a few weeks! Do pop over and visit the website, as we will have an online poster session with downloadable copies of the posters from this year’s conference.

The 8th Experimental Archaeology Conference will run in early-mid January 2014, and you can follow updates on the conference on the Twitter stream I also manage.

Analysis work at the Earth Sciences Department, Fribourg, Switzerland

I’ve been extremely lucky to have the opportunity to use the Phillips 2400 wavelength dispersive x-ray fluorescence spectrometer (WD-XRF) at the Earth Sciences Department in Fribourg, Switzerland. Consequently I’ve been extremely quiet here, as I’ve been in the labs non-stop for the last few months preparing my samples.

Here at the Institute we prepare our samples for analysis by crushing them and pressing the resulting powder into a homogeneous pellet. Considering I have more than two hundred samples of archaeological iron slag, ore and ceramic, this took a considerable amount of time! Thankfully the analysis work in Switzerland went very smoothly, and the results from the pellets of certified reference materials suggest that the WD-XRF and the associated analytical software should provide me with an accurate analysis of the chemical composition of my samples.

Whilst I still have microscopy and some targeted analysis to undertake, I now have a really substantial dataset to start investigating, which is looking very exciting, far more exciting than the actual process of analysis itself! After all the hundreds of hours spent crushing, grinding, milling and pressing the samples, I spent just four days putting them through the WD-XRF, which for most of the time is a large, noisy, motionless grey box. Hopefully I will be able to get back to writing in the near future, but in the meantime here is a video of the only exciting thing that happened during the analytical process: the automated sample changer swapping out samples!
