Interactive map shows the global refugee crisis like you've never seen it before


The magnitude of the global refugee crisis can be hard to grasp, especially when it is conveyed only through dry numbers and statistics. But a new map turns that data into an engaging, highly visual depiction of refugee movement.

The map, created by researchers at Carnegie Mellon University's Community Robotics, Education, and Technology Empowerment (CREATE) Lab, shows the movement of refugees around the world between 2000 and 2015. The map makes the crisis easier to engage with, putting particular attention on Afghanistan, South Sudan, Somalia, and Syria.

The map was created using specialized technology developed by the CREATE Lab, with data compiled by the U.N. Refugee Agency (UNHCR). The project plays like a video along a 15-year timeline; one yellow dot represents about 17 refugees leaving a country, while one red dot represents about 17 refugees arriving somewhere else. As each conflict or crisis erupts in a region, a series of yellow dots floods out of the area, eventually changing color to red as they "settle" in another country. Users can zoom in on specific locations to get a more granular look at refugee displacement in those regions. You can also scroll back and forth between years to see the effects of conflict and disaster unfold.

"With these moving maps that we can now create ... this highly interactive visual animation moves people beyond bias, enabling viewers to achieve common ground, fast," says Illah Nourbakhsh, CREATE Lab director. "After all, the visual cortex is the very fastest way of delivering complex data to our minds."

While it's widely known that refugees often flee conflict in developing nations, the map dispels myths about refugee resettlement, showing that most refugees relocate to neighboring developing countries rather than Western nations. In fact, developing countries host nearly 90 percent of the world's refugees. There are an estimated 21.3 million refugees in the world today, with 53 percent hailing from just three countries — Somalia, Afghanistan, and Syria. The top five countries hosting the most refugees around the world are Turkey, Pakistan, Lebanon, Iran, and Ethiopia. All of these countries are considered developing nations by the U.N.

The visualization is part of the Explorables project, a platform by the CREATE Lab helping to make big data more digestible. To do this, the project relies on maps to make global crises and complex data more accessible. So far, researchers have created maps addressing global income inequality and fracking earthquakes in the U.S., among others. The researchers hope this visual approach to depicting some of the world's biggest problems will help create more interest in crises affecting vulnerable communities around the globe.

The interactivity, Nourbakhsh says, helps viewers "become intimate with the data." "Instead of mere observers, they become participants in making meaning of data," he says. "Then, they can work to answer the questions we really should be asking: Why does our world view allow this sort of harm to exist? How can we work together to change the status quo for the better?"
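As a back-of-the-envelope illustration of the dot encoding described above (one dot standing for roughly 17 people, yellow for departures and red for arrivals), here is a short Python sketch; the flow counts below are invented for illustration and the code is not part of the CREATE Lab tool.

# Hypothetical flows (origin, destination, year, refugee count); the numbers
# are made up purely to illustrate the ~17-people-per-dot encoding.
PEOPLE_PER_DOT = 17

flows = [
    ("Syria", "Turkey", 2015, 2_500_000),
    ("Somalia", "Ethiopia", 2015, 250_000),
]

for origin, destination, year, count in flows:
    dots = round(count / PEOPLE_PER_DOT)
    # Each dot would start yellow as it leaves `origin` and turn red
    # as it settles in `destination`.
    print(f"{year}: {origin} -> {destination}: {count:,} people = about {dots:,} dots")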

Amazon’s New Customer


Back in 2006, when the iPhone was a mere rumor, Palm CEO Ed Colligan was asked if he was worried: "We've learned and struggled for a few years here figuring out how to make a decent phone," he said. "PC guys are not going to just figure this out. They're not going to just walk in." What if Steve Jobs' company did bring an iPod phone to market? Well, it would probably use WiFi technology and could be distributed through the Apple stores and not the carriers like Verizon or Cingular, Colligan theorized.

I was reminded of this quote after Amazon announced an agreement to buy Whole Foods for $13.7 billion; after all, it was only two years ago that Whole Foods founder and CEO John Mackey predicted that groceries would be Amazon's Waterloo. And while Colligan's prediction was far worse — Apple simply left Palm in the dust, unable to compete — it is Mackey who has to call Amazon founder and CEO Jeff Bezos, the Napoleon of this little morality play, boss.

The similarities go deeper, though: both Colligan and Mackey made the same analytical mistake: they misunderstood their opponent's goals, strategies, and tactics. This is particularly easy to grok in the case of Colligan and the iPhone: Apple's goal was not to build a phone but to build an even more personal computer; their strategy was not to add functionality to a phone but to reduce the phone to an app; and their tactics were not to duplicate the carriers but to leverage their connection with customers to gain concessions from them.

Mackey's misunderstanding was more subtle, and more profound: while the iPhone may be the most successful product of all time, Amazon and Jeff Bezos have their sights on being the most dominant company of all time. Start there, and this purchase makes all kinds of sense.

If you don't understand a company's goals, how can you know what its strategies and tactics will be? Unfortunately, many companies, particularly the most ambitious, aren't as explicit as you might like. In the case of Amazon, the company stated in its 1997 S-1:

Amazon.com's objective is to be the leading online retailer of information-based products and services, with an initial focus on books.

Even if you picked up on the fact that books were only step one (which most people at the time did not), it was hard to imagine just how all-encompassing Amazon.com would soon become; within a few years Amazon's updated mission statement reflected the reality of the company's e-commerce ambitions:

Our vision is to be earth's most customer centric company; to build a place where people can come to find and discover anything they might want to buy online.

"Anything they might want to buy online" was pretty broad; the advent of Amazon Web Services a few years later showed it wasn't broad enough, and a few years ago Amazon reduced its stated goal to just that first clause:

We seek to be Earth's most customer-centric company.

There are no more bounds, and I don't think that is an accident. As I put it on a podcast a few months ago, Amazon's goal is to take a cut of all economic activity.

This, then, is the mistake Mackey made: while he rightly understood that Amazon was going to do everything possible to win in groceries — the category accounts for about 20% of consumer spending — he presumed that the effort would be limited to e-commerce. E-commerce, though, is a tactic; indeed, when it comes to Amazon's current approach, it doesn't even rise to the level of strategy.
As you might expect, given a goal as audacious as "taking a cut of all economic activity", Amazon has several different strategies. The key to the enterprise is AWS: if it is better to build an Internet-enabled business on the public cloud, and if all businesses will soon be Internet-enabled businesses, it follows that AWS is well placed to take a cut of all business activity.

On the consumer side the key is Prime. While Amazon has long pursued a dominant strategy in retail — superior cost and superior selection — it is difficult to build sustainable differentiation on these factors alone. After all, another retailer is only a click away. This, though, is the brilliance of Prime: thanks to its reliability and convenience (two-day shipping, sometimes faster!), plus human fallibility when it comes to considering sunk costs (you've already paid $99!), why even bother looking anywhere else? With Prime Amazon has created a powerful moat around consumer goods that does not depend on simply having the lowest price, because Prime customers don't even bother to check.

This, though, is why groceries are a strategic hole: not only are they the largest retail category, they are the most persistent opportunity for other retailers to gain access to Prime members and remind them there are alternatives. That is why Amazon has been so determined in the space: AmazonFresh launched a decade ago and, unlike other Amazon experiments, has continued to receive funding along with other rumored initiatives like convenience stores and grocery pick-ups. Amazon simply hasn't been able to figure out the right tactics.

To understand why groceries are such a challenge, look at how they differ from books, Amazon's first product:

- There are far more books than can ever fit in a physical store, which means an e-commerce site can win on selection; in comparison, there simply aren't that many grocery items (a typical grocery store will have between 30,000 and 50,000 SKUs).
- When you order a book, you know exactly what you are getting: a book from Amazon is the same as a book from a local bookstore; groceries, on the other hand, can vary in quality not just store-to-store but, particularly in the case of perishable goods, item-to-item.
- Books can be stored in a centralized warehouse indefinitely; perishable groceries can only be stored for a limited amount of time and degrade in quality during transit.

As Mackey surely understood, this meant that AmazonFresh was at a cost disadvantage to physical grocers as well: in order to be competitive AmazonFresh needed to stock a lot of perishable items; however, as long as AmazonFresh was not operating at meaningful scale, a huge number of those perishable items would spoil. And, given the inherently local nature of groceries, scale needed to be achieved not on a national basis but city by city.

Groceries are a fundamentally different problem that needs a fundamentally different solution; what is so brilliant about this deal, though, is that it solves the problem in a fundamentally Amazonian way. Last year in The Amazon Tax I explained how the different parts of the company — like AWS and Prime — were on a conceptual level more similar than you might think, and that said concepts were rooted in the very structure of Amazon itself.
The best example is AWS, which offered server functionality as "primitives", giving maximum flexibility for developers to build on top of. The "primitives" model modularized Amazon's infrastructure, effectively transforming raw data center components into storage, computing, databases, etc., which could be used on an ad-hoc basis not only by Amazon's internal teams but also by outside developers. This AWS layer in the middle has several key characteristics:

- AWS has massive fixed costs but benefits tremendously from economies of scale.
- The cost to build AWS was justified because the first and best customer is Amazon's e-commerce business.
- AWS's focus on "primitives" meant it could be sold as-is to developers beyond Amazon, increasing the returns to scale and, by extension, deepening AWS's moat.

This last point was a win-win: developers would have access to enterprise-level computing resources with zero up-front investment; Amazon, meanwhile, would get that much more scale for a set of products for which they would be the first and best customer.

As I detailed in that article, this exact same framework applies to Amazon.com: Prime is a super experience with superior prices and superior selection, and it too feeds into a scale play. The result is, of course, the same structure as AWS — and it shares similar characteristics:

- E-commerce distribution has massive fixed costs but benefits tremendously from economies of scale.
- The cost to build out Amazon's fulfillment centers was justified because the first and best customer is Amazon's e-commerce business.

That last bullet point may seem odd, but in fact 40% of Amazon's sales (on a unit basis) are sold by 3rd-party merchants; most of these merchants leverage Fulfilled-by-Amazon, which means their goods are stored in Amazon's fulfillment centers and covered by Prime. This increases the return to scale for Amazon's fulfillment centers, increases the value of Prime, and deepens Amazon's moat.

As I noted in that piece, you can see the outline of similar efforts in logistics: Amazon is building out a delivery network with itself as the first-and-best customer; in the long run it seems obvious said logistics services will be exposed as a platform.

This, though, is what was missing from Amazon's grocery efforts: there was no first-and-best customer. Absent that, and given all the limitations of groceries, AmazonFresh was doomed to be eternally sub-scale.

This is the key to understanding the purchase of Whole Foods: to the outside it may seem that Amazon is buying a retailer. The truth, though, is that Amazon is buying a customer — the first-and-best customer that will instantly bring its grocery efforts to scale. Today, all of the logistics that go into a Whole Foods store are for the purpose of stocking physical shelves: the entire operation is integrated. What I expect Amazon to do over the next few years is transform the Whole Foods supply chain into a service architecture based on primitives: meat, fruit, vegetables, baked goods, non-perishables (Whole Foods' outsized reliance on store brands is something that I'm sure was very attractive to Amazon). What will make this massive investment worth it, though, is that there will be a guaranteed customer: Whole Foods Markets.
In the long run, physical grocery stores will be only one of Amazon Grocery Services' customers: obviously a home delivery service will be another, and it will be far more efficient than a company like Instacart trying to layer on top of Whole Foods' current integrated model. I suspect Amazon's ambitions stretch further, though: Amazon Grocery Services will be well placed to start supplying restaurants too, giving Amazon access to another big cut of economic activity. It is the AWS model, which is to say it is the Amazon model, but like AWS, the key to profitability is having a first-and-best customer able to utilize the massive investment necessary to build the service out in the first place.

I said at the beginning that Mackey misunderstood Amazon's goals, strategies, and tactics, and while that is true, the bigger error was in misunderstanding Amazon itself: unlike Whole Foods, Amazon has no desire to be a grocer, and contrary to conventional wisdom the company is not even a retailer. At its core Amazon is a services provider enabled — and protected — by scale. Indeed, to the extent Waterloo is a valid analogy, Amazon is much more akin to the British Empire, and there is now one less obstacle to sitting astride all aspects of the economy.

Europe plans to have drone rules in place by 2019


The European Union doesn't want to fall behind on drones, which could bring billions of euros to its member states. The EU's commission for developing a European air traffic control system said on Friday that its goal is to have rules for the safe operation of autonomous drones ready by 2019.

Ultimately, the EU says that it wants to create a traffic management system for unmanned drones that's similar to air traffic control for manned planes. But in the near term, the commission says that in the next two years it plans to have a European registration system for drones, a solution for keeping drones from flying over prohibited areas, and remote identification of unmanned aircraft.

That would put the EU on a similar pace to the U.S., where the Federal Aviation Administration created a new committee in March to work on a remote identification system for drones. The U.S. already requires commercial pilots to register their aircraft, but last month a U.S. federal court nixed the registration rule for non-commercial pilots, stating the FAA didn't have authority to regulate hobbyists. But for remote identification of drones to work, which will be important for both security and managing traffic, aircraft will likely need to be registered somewhere. Otherwise, drones would operate like unmarked cars on the road, only with no driver and no one to pull over if there's a problem.

The U.S. has rules against flying over restricted airspace, but those rules aren't required to be baked into drone software, for example in the form of geo-fencing. Rather, in the U.S., pilots are instructed to avoid flying over restricted areas, though many manufacturers include geo-fencing in their drones anyway. It's unlikely the U.S. will cement remote identification rules without a registration system for non-commercial drones, since law enforcement has voiced concerns about not being able to identify pilots, whether the drone is being used to make money or flown for fun. If the EU's timeline is realized, Europe might skip ahead of the United States.

The EU says its motivation for putting a due date on its rules is to position Europe as a world leader for emerging drone technology. Currently, EU-wide drone rules only cover drones that weigh at least 330 lbs. (about 150 kilograms), and member states are tasked with making their own national rules for lighter aircraft. "Such fragmentation hampers the development of new products, the swift introduction of technologies and may also create safety risks," reads a statement on the new drone regulation timeline from the European Union.

The addressable global market for drone services, like using drones to deliver packages or inspect construction sites, is valued at over $127 billion, according to a 2016 study from PwC.

Innovative Postage Stamp Celebrates Upcoming Total Solar Eclipse


The total solar eclipse pictured on a postage stamp being released today is a remarkably good representation of what hordes of eclipse watchers will see with their own eyes this August. So says astrophysicist Fred Espenak, who snapped the photos of the Sun and Moon featured on the new stamp.

The new stamp from the U.S. Postal Service (USPS) commemorates the 21 August total solar eclipse that will be viewable, weather permitting, along a roughly 110-kilometer-wide "path of totality" across the country from Oregon to South Carolina. This is the first time since 1979 that a total solar eclipse will be visible on the U.S. mainland. The postal service is rolling out the 49-cent "Forever" stamp, always worth the 1-ounce price for First Class Mail, on the cusp of the 2017 summer solstice.

To create the stamp, its designer used a composite image of an earlier eclipse provided by Espenak. Espenak's image digitally stitched together 22 separate photographs that he took of a 2006 total solar eclipse in Libya. Combining exposures taken at different camera shutter speeds and fine-tuning and filtering them on a computer brought out details of the solar atmosphere that otherwise would not be visible on a stamp, according to Espenak, an eclipse expert, astrophysicist, and photographer. Espenak, who maintains NASA's official eclipse website, has been dubbed by some as "Mr. Eclipse."

The image that resulted from those sophisticated techniques shows exceptionally fine gradations of light and dark, so it comes close to representing what the Sun's corona looks like to the naked eye, explained Espenak, who retired from NASA's Goddard Space Flight Center in Greenbelt, Md., after working there as an astrophysicist for more than 30 years. "The inner part of the corona is 1000 times brighter than the outer corona [that is] just a Sun's diameter away," Mr. Eclipse said. "The eye can see that beautifully but photographs don't reveal that" unless they undergo special processing.

But that's not all that's exceptional about this stamp. If you touch the eclipse image on the stamp, the heat from your finger temporarily reveals an image of the full Moon (also shot by Espenak) covering the disk of the Sun. This effect is made possible with thermochromic ink that changes the stamp's look in response to temperature, the first time USPS has used this technique.

USPS encourages postage stamp art directors to think of new approaches to subject matter that can enhance the stamp program, according to Antonio Alcalá, owner and creative director of Studio A in Alexandria, Va., who designed this stamp. "I believe a primary experience of a solar eclipse is the rapid transition from daylight to darkness to daylight again," Alcalá told Eos. "Having seen thermochromic printing a few times in the past, I thought this technique might be suitable for conveying this general idea."

Contributing as well to the visual impact of these stamps, the postal service is printing them in a four-color process. This printing method achieves a richer black because less light reflects from the sheet of paper, and it allows for a greater range of tonality, Alcalá explained. The flip side of each pane of 16 of the new stamps shows a map of the United States crossed by the path of totality and gives the times of the total solar eclipse at each location specified on the map, from Salem, Ore., to Charleston, S.C. USPS issued a much more traditional eclipse stamp in 2000.
Other countries also have issued eclipse stamps in the past, including Mexico, Zimbabwe, and Libya.

Espenak told Eos that, sure, he's thrilled about his images being used on the stamp, but he's also excited to inform and educate millions of people about an astronomical phenomenon. A solar eclipse is "one of the most spectacular ways" to reach out to the public and get them interested in science and technology, he said. "The United States in particular is suffering from severe science illiteracy right now," Espenak said. "It's heartbreaking seeing the way the leadership in this country is taking us away from science."

Seeing a total solar eclipse "is a life-changing event," said Espenak, who looks forward to the August eclipse as his 28th such experience, noting that he has "only" spent about 70 minutes overall in totality. Espenak said that nothing—not videos, books, or photos—can prepare somebody for the sensation of witnessing a total eclipse in person. "When that moon shadow comes over the horizon and suddenly sweeps over [the Sun], you plunge from bright daylight into twilight in seconds," he said. "You feel this event in the pit of your stomach, you feel it on the hair on your arms and the back of your neck. You have a visceral reaction that something is very different, something is very wrong, even." Because "it seems apocalyptic" with the Sun's bright disk briefly gone, "you can easily empathize with people thousands of years ago who didn't understand what was happening," he added.

Espenak took a video of the 1 August 2008 total solar eclipse from Jinta, China ("Total Solar Eclipse of 2008 – 1," from Fred Espenak on Vimeo). "Consider this an appetizer for the 2017 total solar eclipse through the USA," he writes in notes accompanying the video.

Espenak caught his first total solar eclipse in 1970 when his parents allowed him, then 18 years old with a newly issued driver's license, to drive the family's blue Chevy Biscayne, unchaperoned, about 750 kilometers from Staten Island, N.Y., to Windsor, N.C., to get in the path of the eclipse. He checked into a motel, and the next morning, on eclipse day, the area behind the motel "was a field of tripods. Everybody was there in that motel to see the eclipse. It was like a big party there."

"I'm really wound up on eclipse day," said Espenak, who leads eclipse tours. There are so many cameras to set up and align that they must be "coordinated almost like a ballet," he said. On eclipse day, he's not very social, cordoning off his staging area with police tape as a visual warning to others not to disturb him or his equipment. "A number of TV stations have said they wanted to interview me during the eclipse, and I said, 'Are you out of your mind?'" Espenak said. "You can't pay me enough, not during the eclipse."

The August eclipse will be another nerve-wracking time for Espenak. A few days prior, he'll talk about the eclipse at an astronomy convention in Casper, Wyo., which lies in the path of totality. However, if the forecast calls for clouds in Casper, he will be ready to drive an SUV full of photo equipment as far as he can along that path, the day before the eclipse, looking for better weather. That makes sense for Espenak, who said that seeing total eclipses "has been the biggest thrill of my life."

—Randy Showstack (@RandyShowstack), Staff Writer

Best Practices for Creating Bilingual Apps


Offering your web app in multiple languages can increase your audience and the overall impact of the information. In this blog, we will show two examples of configuring apps to support a bilingual audience.

Example 1: Configuring two apps

In the first example, we will be using the Basic Viewer template to showcase a map. For this project, our audience is both French and English speakers. The map's pop-up is configured with a custom attribute display explaining the data. These steps start after your initial map and app have been created.

1. Identify data to translate
First, let's start by identifying which data Esri translates as part of the localization process and what you will need to handle yourself. Esri will display all application elements, such as search, element labels, and tooltips, in the browser or ArcGIS organization locale. In the accompanying screenshots, green boxes show what Esri translates and red boxes highlight what is driven by web map content. In this example, the web map and data are in English and the browser locale was set to French.

2. Translate data and create the second map
The pop-up, app title and layer titles need to be updated in the French version. Create a second map and update all required data to fully translate your map and app into the language you need. Since I am using a custom attribute pop-up, I needed to translate it myself.

3. Configure the second application
Configure and publish your app. While configuring your app you will be able to change the title, subtitles and additional information to your language of choice.

Optional tips: To enable users to easily choose the appropriate language, I have added a splash screen to both apps. The splash contains a link to an application that has a web map configured in French. In addition to the splash screen, URL parameters can be set to force the locale to be French. In the splash screen hyperlink, a URL parameter can be added to the application URL to force the locale; in this example, "&locale=fr" was added to the end of the French application's URL. For more information on URL parameters, visit the help topic.

4. Share your app
Review and test your app in both languages and verify everything is correctly translated. I generated the translations with Google Translate for this example; you may want to have a fluent speaker verify your translations. Make any configuration or data adjustments, then share your app with your audience.

Example 2: Configuring one app to support two languages

This example will illustrate how to configure one app to support two languages. For this project, I want to collect data from the community. The community I'm targeting contains both English and Spanish speakers, and I'd like to collect all the information in one application. This example will use the GeoForm template.

1. Configure data to support two languages
In the GeoForm application, I have four fields I will use to collect data. One field contains a domain to drive the drop-down options in the GeoForm app. I created the domain values to appear in both Spanish and English. A bit of planning or updating may be necessary to make your data contain both Spanish and English values. These domains were created and published from ArcGIS Pro.

2. Configure the application
During the configuration process, you can assign a title and short instructions in both languages. In most cases you should adjust the layer names in the web map, but the GeoForm allows you to change the field name labels in the builder.
This is a simple way to add a second language to the field names.

3. Share your app
Test your app in both languages to ensure that everything the author needs to offer in two languages is present, and that the rest is localized. An easy way to test is to use the URL parameter discussed earlier. Refine any details as needed, then share your app. If you are going to provide links to this app from a website, it is also a good idea to use the locale parameter to ensure that the app UI is in the correct language for your target audience; a small sketch of building such links follows below. Here is the English version of the GeoForm example and here is the Spanish version.
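As a minimal sketch of the locale-parameter idea above, the snippet below builds language-specific links for a pair of configured apps. The base URL, the app IDs, and the app_link helper are hypothetical placeholders rather than values from this post; only the locale parameter (e.g. &locale=fr) comes from the examples above.

# Minimal sketch (hypothetical IDs and URL): build per-language links to a
# configurable ArcGIS web app by appending the locale URL parameter.
from urllib.parse import urlencode

BASE_URL = "https://www.arcgis.com/apps/GeoForm/index.html"  # placeholder app URL

APP_IDS = {
    "en": "0123456789abcdef",  # placeholder id of the English-configured app
    "fr": "fedcba9876543210",  # placeholder id of the French-configured app
}

def app_link(lang: str) -> str:
    # Force the app UI locale with the locale parameter, e.g. &locale=fr
    return BASE_URL + "?" + urlencode({"appid": APP_IDS[lang], "locale": lang})

for lang in APP_IDS:
    print(lang, "->", app_link(lang))

Links like these can sit behind the splash screen or on a website so that each audience lands in the app with the matching UI language.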

July 18–22, 2017

The week's programme ran as a three-day grid of parallel sessions (Wednesday the 19th through Friday the 21st), with the opening and closing plenaries and six keynotes in the Caquot room and talk sessions in the Cauchy, Navier, Picard, and Bienvenüe rooms. Talks covered open-source geospatial (FOSS4G/OSGeo) software and applications, including GDAL 2.2, GRASS GIS 7.2, Orfeo ToolBox 6.0, GeoServer, GeoNetwork, GeoNode, QGIS 3 and QGIS Server 3, PyWPS 4, Mapbender3, the ZOO-Project, mappyfile for MapServer, pgRouting, OpenMapTiles, Mapbox GL, CARTO, MapStore 2, Lizmap, GeoMapFish, GeoPackage, WebWorldWind, Melown, GraphHopper, VROOM, pyroSAR, istSOS, RTKLIB, and Geopaparazzi, alongside sessions on INSPIRE and Copernicus data, OpenStreetMap data quality, Sentinel-2 and Earth observation processing, 3D and indoor mapping (BIM and IndoorGML), machine learning for remote sensing, OSGeo community topics, and FOSS4G in education and disaster management.

7-year mapping project adds Indigenous communities to Google Earth


More than 3,000 Indigenous communities in Canada have been added to Google Maps and Google Earth — and it's about time, Aboriginal people say.

"It's important to me because there are so many Indigenous groups across the country and to not see them as an important fabric of a base map, just to not be recognized, it's insulting," said Steve DeRoy, a member of the Ebb and Flow First Nation in Manitoba and one of the researchers and cartographers working on the project. "We are in our 150th year [of] Canada being a country; it's just one step closer to reconciliation."

For the last seven years, DeRoy and other experts have worked with Natural Resources Canada and Indigenous communities to compile coordinates and mapping information. The updated sites include First Nations reserve lands, but also treaty settlement lands belonging to Indigenous communities. One of these is the Long Plain First Nation urban reserve in Winnipeg's west end, acquired in 2006, fulfilling in part Canada's outstanding treaty land entitlement. Called the Madison Reserve, it was recognized as a reserve in 2013 and features a Petro-Canada gas station and office complex. "What you have is a number of Indigenous communities — some of them have multiple reserve parcels, some of them have single reserves, and so 3,000 is the total number of reserve lands and settlement lands that we have on the map," DeRoy said.

"This is a big moment, this is an opportunity. We actually kicked this project off seven years ago, so it's been a relatively long process and one that's definitely worth doing thoroughly and doing collaboratively," Google Canada's Alexandra Hunnings said. "We've had workshops across the country and the idea is to empower Indigenous communities with the ability to build their own maps to reflect their own communities on our platforms. We're fortunate to have a platform that many people around the world use and we want Canada and what makes us Canadian reflected on those platforms."

Canada has more than 1.4 million people who self-identify as First Nations, Métis or Inuit. There are 600 bands living on 3,100 reserves and in urban centres across the country.

DeRoy acknowledged that mapping can be political, especially decisions about what is included and what is not. "It's unfortunate that Indigenous people have been excluded from the maps and it's taken a long time just to have that recognition — just to be showing on the maps," he said. "It's one step closer to reconciliation and having Google be able to put this information into the world. It's fantastic. It's a great feeling."

If Indigenous communities want to add or update information about their lands, such as roads, addresses or businesses, their government can contribute data through the Base Map Partner Program. Individual community members can also use the Send Feedback tool to add and edit essential information on Google Maps.

🌟 Introducing Dash 🌟

Dash is an open-source Python library for creating reactive, web-based applications. Dash started as a public proof-of-concept on GitHub two years ago. We kept this prototype online, but subsequent work on Dash occurred behind closed doors. We used feedback from private trials at banks, labs, and data science teams to guide the product forward. Today, we're excited to announce the first public release of Dash that is both enterprise-ready and a first-class member of Plotly's open-source tools. Dash can be downloaded today from Python's package manager with pip install dash — it's entirely open source and MIT licensed. You'll find a getting started guide here and the Dash code on GitHub here.

Dash is a user interface library for creating analytical web applications. Those who use Python for data analysis, data exploration, visualization, modelling, instrument control, and reporting will find immediate use for Dash. Dash makes it dead-simple to build a GUI around your data analysis code. Here's a 43-line example of a Dash app that ties a Dropdown to a D3.js Plotly Graph. As the user selects a value in the Dropdown, the application code dynamically exports data from Google Finance into a Pandas DataFrame. This app was written in just 43 lines of code (view the source). Simple.

Dash app code is declarative and reactive, which makes it easy to build complex apps that contain many interactive elements. Here's an example with 5 inputs, 3 outputs, and cross filtering. This app was composed in just 160 lines of code, all of which were Python.

Every aesthetic element of the app is customizable: the sizing, the positioning, the colors, the fonts. Dash apps are built and published on the web, so the full power of CSS is available. Here's an example of a highly customized, interactive Dash report app, in the brand and style of a Goldman Sachs report.

While Dash apps are viewed in the web browser, you don't have to write any JavaScript or HTML. Dash provides a Python interface to a rich set of interactive web-based components.

import dash_core_components as dcc

dcc.Slider(value=4, min=-10, max=20, step=0.5,
           labels={-5: '-5 Degrees', 0: '0', 10: '10 Degrees'})

Dash provides a simple reactive decorator for binding your custom data analysis code to your Dash user interface.

@dash_app.callback(Output('graph-id', 'figure'),
                   [Input('slider-id', 'value')])
def your_data_analysis_function(new_slider_value):
    new_figure = your_compute_figure_function(new_slider_value)
    return new_figure

When an input element changes (e.g. when you select an item in the dropdown or drag a slider), Dash's decorator provides your Python code with the new value of the input. Your Python function can do anything it wants with this new input value: it could filter a Pandas DataFrame, make a SQL query, run a simulation, perform a calculation, or start an experiment. Dash expects that your function will return a new property of some element in the UI, whether that's a new graph, a new table, or a new text element.

For example, here's a simple Dash application that updates a text box as you interact with the Graph element. The application code filters data in a Pandas DataFrame based on the currently selected point. Another Dash application displays meta information about drugs as you hover over points in the Graph component; its code also appends rows to the Table component when elements are added to the multi Dropdown component.
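To tie the pieces above together, here is a minimal, self-contained app in the same style; it is a sketch rather than one of the post's own examples, and the component IDs and figure contents are invented for illustration.

import dash
import dash_core_components as dcc
import dash_html_components as html
from dash.dependencies import Input, Output

app = dash.Dash()

# A slider bound to a graph; the IDs below are illustrative, not from the post.
app.layout = html.Div([
    dcc.Slider(id='exponent', value=2, min=1, max=5, step=1),
    dcc.Graph(id='power-curve'),
])

@app.callback(Output('power-curve', 'figure'), [Input('exponent', 'value')])
def update_figure(exponent):
    # Recompute the figure whenever the slider value changes.
    xs = list(range(11))
    ys = [x ** exponent for x in xs]
    return {
        'data': [{'x': xs, 'y': ys, 'type': 'scatter', 'mode': 'lines+markers'}],
        'layout': {'title': 'y = x^{}'.format(exponent)},
    }

if __name__ == '__main__':
    app.run_server(debug=True)

Running the file and opening the local URL it prints shows the graph re-rendering as the slider moves, which is exactly the reactive loop the decorator section describes.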
Through these two abstractions — Python components and reactive functional decorators — Dash abstracts away all of the technologies and protocols that are required to build an interactive web-based application. Dash is simple enough that you can bind a user interface around your Python code in an afternoon.

Flask and React

Dash applications are web servers running Flask and communicating JSON packets over HTTP requests. Dash's frontend renders components using React.js, the JavaScript user-interface library written and maintained by Facebook.

Flask is great. It's widely adopted by the Python community and deployed in production environments everywhere. The underlying instance of Flask, and all of its configurable properties, is accessible to Dash app developers. For advanced developers, Dash apps can be extended through the rich set of Flask plugins as well.

React is fantastic too. At Plotly, we've rewritten our entire web platform and our online chart editor with React. One of the incredible things about React is how prolific and talented the community is. The open-source React community has published thousands of high-quality interactive components, from dropdowns to sliders to calendar pickers to interactive tables.

Dash leverages the power of Flask and React, putting them to work for Python data scientists who may not be expert web programmers.

From React.js to Python Dash Components

Dash components are Python classes that encode the properties and values of a specific React component and that serialize as JSON. Dash provides a toolset to easily package React components (written in JavaScript) as components that can be used in Dash. This toolset uses dynamic programming to automatically generate standard Python classes from annotated React propTypes. The resulting Python classes that represent Dash components are user friendly: they come with automatic argument validation, docstrings, and more.

Here's an example of the dynamically generated argument validation:

>>> import dash_core_components as dcc
>>> dcc.Dropdown(valu=3)
Exception: Unexpected keyword argument `valu`
Allowed arguments: id, className, disabled, multi, options, placeholder, value

and an example of the dynamically generated component docstrings:

>>> help(dcc.Dropdown)
class Dropdown(dash.development.base_component.Component)
 |  A Dropdown component.
 |  Dropdown is an interactive dropdown element for selecting one or more
 |  items. The values and labels of the dropdown items are specified in the
 |  `options` property and the selected item(s) are specified with the `value`
 |  property.
 |
 |  Use a dropdown when you have many options (more than 5) or when you are
 |  constrained for space. Otherwise, you can use RadioItems or a Checklist,
 |  which have the benefit of showing the users all of the items at once.
 |
 |  Keyword arguments:
 |  - id (string; optional)
 |  - className (string; optional)
 |  - disabled (boolean; optional): If true, the option is disabled
 |  - multi (boolean; optional): If true, the user can select multiple values
 |  - options (list; optional)
 |  - placeholder (string; optional): The grey, default text shown when no option is selected
 |  - value (string | list; optional): The value of the input. If `multi` is false (the default)
 |    then value is just a string that corresponds to the values
 |    provided in the `options` property. If `multi` is true, then
 |    multiple values can be selected at once, and `value` is an
 |    array of items with values corresponding to those in the
 |    `options` prop.
 |
 |  Available events: 'change'

The full set of HTML tags, like <div/>, <img/>, and <table/>, are also rendered dynamically with React, and their Python classes are available through the dash_html_components library. A core set of interactive components like Dropdown, Graph, and Slider will be maintained by the Dash core team through the dash_core_components library. Both of these libraries use the standard open-source React-to-Dash toolchain that you could use if you were to write your own component library.

You're not tied to using the standard Dash component library. The Dash component libraries are imported separately from the core Dash library. With the React-to-Dash toolchain, you can easily write or port a React.js component into a Python class that can be used in your Dash application. Here's the tutorial on building your own components. Or, the Dash core team can build one for you.

Concurrency — Multi-User Applications

The state of a Dash application is stored in the front-end (i.e. the web browser). This allows Dash apps to be used in a multi-tenant setting: multiple users can have independent sessions while interacting with a Dash app at the same time. Dash application code is functional: your application code can read values from the global Python state but it can't modify them. This functional approach is easy to reason about and easy to test: it's just inputs and outputs with no side effects or state.

CSS and Default Styles

CSS and default styles are kept out of the core library for modularity, independent versioning, and to encourage Dash app developers to customize the look-and-feel of their apps. The Dash core team maintains a core style guide here.

Data Visualization

Dash ships with a Graph component that renders charts with plotly.js. Plotly.js is a great fit for Dash: it's declarative, open source, fast, and supports a complete range of scientific, financial, and business charts. Plotly.js is built on top of D3.js (for publication-quality, vectorized image export) and WebGL (for high-performance visualization). Dash's Graph element shares the same syntax as the open-source plotly.py library, so you can easily switch between the two. Dash's Graph component hooks into the plotly.js event system, allowing Dash app authors to write applications that respond to hovering, clicking, or selecting points on a Plotly graph.

Open Source Repositories

You can check out the code yourself across a few repositories:

Dash backend: https://github.com/plotly/dash
Dash frontend: https://github.com/plotly/dash-renderer
Dash core component library: https://github.com/plotly/dash-core-components
Dash HTML component library: https://github.com/plotly/dash-html-components
Dash component archetype (React-to-Dash toolchain): https://github.com/plotly/dash-components-archetype
Dash docs and user guide: https://github.com/plotly/dash-docs, hosted at https://plot.ly/dash
Plotly.js — the graphing library used by Dash: https://github.com/plotly/plotly.js

Dash is new in the Python ecosystem, but the concepts and motivation behind Dash have existed for decades in a variety of different languages and applications.

If you're coming from Excel, then your head is in the right place. Both Dash and Excel use a "reactive" programming model. In Excel, output cells update automatically when input cells change. Any cell can be an output, an input, or both. Input cells aren't aware of which output cells depend on them, making it easy to add new output cells or chain together a series of cells.
Here's an example Excel "application". There's an Excel analogy for Dash: instead of cells, we have rich web-based components like sliders, inputs, dropdowns, and graphs; instead of writing Excel or VBA script, we're writing Python code. Here is that same spreadsheet application, rewritten in Dash:

app.layout = html.Div([
    html.Label('Hours per Day'),
    dcc.Slider(id='hours', value=5, min=0, max=24, step=1),

    html.Label('Rate'),
    dcc.Input(id='rate', value=2, type='number'),

    html.Label('Amount per Day'),
    html.Div(id='amount'),

    html.Label('Amount per Week'),
    html.Div(id='amount-per-week')
])

@app.callback(Output('amount', 'children'),
              [Input('hours', 'value'), Input('rate', 'value')])
def compute_amount(hours, rate):
    return float(hours) * float(rate)

@app.callback(Output('amount-per-week', 'children'),
              [Input('amount', 'children')])
def compute_amount_per_week(amount):
    return float(amount) * 7

I like this example a lot because Excel still reigns supreme, even in technical computing and quantitative finance. I don't think that Excel's dominance is just a matter of technical ability. After all, there are legions of spreadsheet programmers who have learned the nuances of Excel, VBA, and even SQL. It's more that Excel spreadsheets are frequently easier to share than Python programs, and Excel cells are easier to edit than command line arguments. Yet modelling in Excel has well-known limits: these spreadsheets often outgrow themselves. They become too large or fragile to migrate into a production environment, peer review, test, and maintain. Remember the 2013 pro-austerity Excel typo? I hope that Dash makes it easier for developers to use Python for their data projects. By sharing the same functional and reactive principles, it's almost as easy to write a Dash app as it is to write an analytical spreadsheet. It's certainly more powerful and presentable.

If you develop in the R programming language, you're in luck. Shiny is a reactive programming framework for generating web applications in pure R. It's great! You can even create interactive graphics with Shiny and Plotly's R library. Dash and Shiny are similar, but Dash does not aim to be a replica of Shiny. The idioms and philosophies between Python and R are different enough to warrant a different syntax. The front-ends of Shiny and Dash are written in different libraries from different times.

If you program in MATLAB, then you may be familiar with MATLAB's user interface library "GUIDE". Mathworks was one of the true original innovators in technical computing — GUIDE was written in 2004, 13 years ago!

If your data is structured in a database, then you may be using Tableau or one of the other BI tools. Tableau is incredible. They've set a new expectation in the industry that end users should have the autonomy and the tools to be able to explore their organization's data. They've also helped popularize the concepts of "drilling down" and cross-filtering. Dash is complementary to BI tools like these. These tools work great for structured data. But when it comes to data transformation and analytics, it's hard to beat the breadth and flexibility of programming languages and communities like Python. Dash abstracts away a lot of the complexities in building user interfaces, enabling you to build a beautiful front-end for your custom data analytics backend.

Finally, I'd like to give a shout-out to Jupyter widgets. Jupyter provides a really nice widget framework inside its notebook interface.
You can add sliders to your graphs in the Jupyter notebooks that you run locally. The widgets in Dash are similar to the widgets in Jupyter. In Jupyter notebooks, you can add widgets directly alongside your code. In Dash, your controls and application are kept separate from your code. Dash is aimed more at sharable apps than at sharable code and notebooks. You can always mix and match the tools, and write your Dash apps in the Jupyter notebook environment. We're also big fans of the nteract project, which is really lowering the barrier to entry of Python and Jupyter notebooks by wrapping up Jupyter Notebook as a desktop application.

Plotly is a VC-backed startup. We were founded in 2013, and we open-sourced our core technology, plotly.js, in 2015 (MIT license). We maintain open-source libraries in Python, R, and MATLAB that interface with plotly.js, and a web app for creating these charts and connecting them to databases (the connectors are also open source). We provide subscriptions to our chart hosting and sharing platform and to our chart editing and database querying app. This platform is available on the web (plot.ly) and on-premise.

We're applying a similar model to Dash. Dash is MIT licensed. It's free to use and to modify. For companies, we're offering Dash Enterprise, a deployment server for easily publishing and provisioning Dash apps behind your firewall. Our goal with Dash Enterprise is to make sharing a Dash app internally as easy and secure as possible. No dev-ops required. Dash Enterprise handles the URL routing, the monitoring, the failure handling, the deployment, the versioning, and the package management. Dash apps deployed with Dash Enterprise can be provisioned through your company's Active Directory or LDAP user accounts.

If you're using the open-source version locally, there are no restrictions. You can manage deployment of Dash apps yourself through platforms like Heroku or Digital Ocean. If you have the resources, consider purchasing a support plan to get one-on-one help from a Plotly engineer. If you need more specialized help or would like to fund specific feature development, reach out to our advanced development program.

Open source is still a new idea for product companies, yet at the end of the day, we're able to dedicate more than half of our staff to open-source products. Huge thanks to everyone who has supported us so far ❤️

Thanks for checking out Dash. I'll be giving a talk about Dash at SciPy this summer in Austin and at Plotcon NYC next fall. If you'll be at either of those events, please say hi! Otherwise, I'll see you on GitHub ✌️🏼

Our Dash documentation is hosted at https://plot.ly/dash
All of our open source work is in our GitHub organization at https://github.com/plotly
If you'd like to fund specialized features, reach out to our Advanced Development team: plot.ly/products/consulting-and-oem/
You can find us on Twitter at @plotlygraphs.

If you're looking for inspiration in user interfaces for technical computing, I highly recommend Bret Victor's essay "What Can A Technologist Do About Climate Change?", in particular the sections on technical computing and media for understanding situations. Relatedly, if you find the intersection between technical computing and interfaces interesting, you might like Explorable Explanations.

You can reach out to me directly at chris@plot.ly or on Twitter at @chriddyp

Luxury flat residents complain rehousing Grenfell families 'unfair'


Residents of a luxury housing block have been slammed online after complaining that the arrival of Grenfell Tower survivors will lead to a fall in property prices. The Standard revealed on Wednesday that 68 "social housing" flats in the £2 billion Kensington Row scheme have been acquired to permanently house families from the nearby tower.

Communities Secretary Sajid Javid said: "Our priority is to get everyone who has lost their home permanently rehoused locally as soon as possible, so that they can begin to rebuild their lives."

But several residents of the luxury complex, which features a gym, swimming pool and 24-hour concierge service that will be off limits to Grenfell families, complained the move was "unfair". One woman, who bought her flat two years ago, told the Guardian: "We paid a lot of money to live here, and we worked hard for it. Now these people are going to come along, and they won't even be paying the service charge."

Another claimed the flats would end up being sub-let. He told the paper: "I'm very sad that people have lost their homes, but there are a lot of people here who have bought flats and will now see the values drop. It will degrade things. And it opens up a can of worms in the housing market."

However, others did agree with the move, noting that some of the flats were completely empty.

The comments, on the day inquests were opened into the deaths of five of the blaze victims, led to outrage online. Journalist Natalie Bloomer tweeted: "Rich people with so much sympathy for Grenfell survivors but god forbid they have to live near them." And student Vonnie Sandlan complained: "This is horrendous. Plenty of sympathy for the victims of Grenfell tower, until it comes to where they'll live." Another tweeter wrote: "Milk of human kindness, these people. I guess money always trumps humanity."

The 68 flats have been purchased by the City of London Corporation in a deal brokered by the Homes and Communities Agency (HCA). The newly built homes are in two affordable housing blocks on a site where private homes are on offer from £1,575,000 to £8.5 million. The Department for Communities and Local Government (DCLG) said the "expectation is that these new properties will be offered as one of the options to permanently rehouse residents from Grenfell Tower". Mr Javid said: "The residents of Grenfell Tower have been through some of the most harrowing and traumatic experiences imaginable and it is our duty to support them."

Extra public money has been found to fit out the flats more quickly, and the developer has taken on more staff and relaxed working hours rules, the DCLG said, with the aim of having the homes ready by the end of July. Eleanor Kelly, chief executive of Southwark Council and spokeswoman for the Grenfell Response Team, said: "Rehousing those residents affected by the Grenfell Tower fire as quickly as possible is our main priority, and I am pleased that a significant amount of housing has now been identified."

The announcement came after much anger from survivors and victims' families at the official response to the deadly blaze. Last week, Labour leader Jeremy Corbyn called for empty homes near the scene of the fire in north Kensington to be requisitioned to house families. An independent public advocate to help bereaved families after major disasters was announced in the Queen's Speech earlier on Wednesday.

The speech confirmed plans for a public inquiry into the tragedy, and a new strategy for resilience in major disasters could include a Civil Disaster Reaction Taskforce to help at times of emergency, with an independent advocate to support those affected and help them at inquests. After the speech, Theresa May apologised for the failures by local and national government in responding to the Grenfell Tower fire. Addressing the Commons on Wednesday, the Prime Minister said the initial support on the ground for families was "not good enough", with people lacking basic information about what they should do and where help was available.

New GeoPlanner Course – It’s Free!


Interested in learning more about GeoPlanner? Interested in green infrastructure planning in GeoPlanner? Esri Training recently released a new course that will introduce you to green infrastructure concepts and how to use GeoPlanner's analysis tools to discover patterns and phenomena in data: Exploring the Green Infrastructure in Your Study Area. The course is the first of three and is exploratory. You'll learn how to create a new GeoPlanner project, use the GeoEnrichment service to add demographic information to a layer, and visualize that layer in 3D to gain a different perspective of risk. You'll also execute the Create Buffers and Find Existing Locations tools to generate layers that model an impact and flood zone area. Performing analysis to reveal risk, or using visualization techniques like 3D and classification to change your perspective, are concepts you can use in any GeoPlanner project. This course is free to anyone with an Esri account. Learn more about Esri accounts here. You'll also need a level 2 publisher account on ArcGIS Online with a GeoPlanner license. If you don't have these, don't worry! Sign up for the free trial by following these instructions and click on the "Don't have an ArcGIS Online subscription" link.
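The course itself is driven entirely through GeoPlanner's web interface and requires no coding, but readers who prefer scripting can approximate the same enrichment and buffer steps with the ArcGIS API for Python. The sketch below is only illustrative under that assumption: the credentials, search term, layer, and data collection are hypothetical, the exact argument forms accepted by these functions vary by library version, and none of this is part of the Esri course.

# Illustrative sketch only: rough analogues of the course's GeoEnrichment and
# buffer steps, using the ArcGIS API for Python rather than the GeoPlanner web app.
from arcgis.gis import GIS
from arcgis.geoenrichment import enrich
from arcgis.features import analysis

gis = GIS("https://www.arcgis.com", "username", "password")  # hypothetical credentials

# A hypothetical study-area layer already published to ArcGIS Online
item = gis.content.search("my_study_area", item_type="Feature Layer")[0]
study_area = item.layers[0]

# Add demographic attributes to the study area (analogous to the GeoEnrichment step)
enriched = enrich(study_areas=study_area.query(), data_collections=["KeyGlobalFacts"])

# Generate buffer polygons, e.g. to approximate an impact or flood zone
buffers = analysis.create_buffers(study_area, distances=[1], units="Miles",
                                  output_name="impact_zone_buffer")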

Geography Increasingly a Priority for Schools, Employers


More than fifteen years after first hitting the market, the Advanced Placement Human Geography course remains one of the College Board's fastest growing offerings, with nearly 185,000 test-takers in 2016. Experts who have followed the assessment's yearly double-digit growth say that its rise in popularity coincides with increased career opportunities for those with skills in Geographic Information Systems—an industry that has mushroomed with technological advancements and the proliferation of quality open source and commercial software programs.

Chris Tucker is the chairman of the board of the American Geographical Society, which is based in New York and bills itself as the oldest geographical society in the United States. Tucker, who spoke with InsideSources by phone, is not a geographer by training but has worked for a number of businesses and nonprofits active in the geography space. One of Tucker's more recent projects, MapStory.org, allows users to create and edit maps on everything from avian migratory patterns to the political evolution of Central America.

Tucker attributes geography's return to mainstream prominence in part to the subject's interesting perch at the intersection of history and the sciences. He also noted that free products coming onto the scene in the early 2000s, like Google Earth, have "democratized" information in the space and opened the field to amateur exploration. Perhaps more importantly for geography's growth, however, is a spike in the market's demand for geographically talented workers.

While AP Human Geography is a good introduction to the field, those serious about working in the space should acquaint themselves with some of the more technical skills in areas like Geographic Information Systems (GIS) or remote sensing tools, said Tucker. "The same way that 20 years ago you could tell an employer, 'I'm good at Excel, or I can take numbers and do interesting things with them,' GIS has become the next wave," he said, while also explaining that organizations from businesses, to nonprofits, to government agencies are hiring workers who specialize in geographical analysis. "Whether it's real estate, or local government, or transportation companies, there is an explosion in the number of jobs to apply those hard skills," he said, using the example of a municipality that may want to review satellite data to check where new construction has occurred and verify that permits have been applied for and taxes have been paid.

To support the next generation of geographers, the American Geographical Society is in its second year of running a fellowship program that trains AP Human Geography teachers in open source mapping software. The goal is to get the educators to become proficient in using the free applications and then to share that expertise with students and colleagues. The teacher-fellows are then encouraged to hold "mapathons" with their classes, which provide students the kind of hands-on, active learning experiences that learning science experts increasingly say are most beneficial for knowledge comprehension and retention. Last year, for example, the fellows practiced by cooperating with the U.S. Census Bureau on a project to map out the areas in and around all of the U.S. national parks. In other cases, said Tucker, students are earning community service hours by remotely mapping rural areas in third world countries that are then used by aid organizations to deliver healthcare or food supplies.

In general, the availability of free or open source data platforms has not always been good news for incumbents in other industries. Education publishing companies, for example, have been reeling in recent years from the competition of freely available or cheap web-based lesson plans. Tucker, however, does not believe that the same dynamic is at work in the geography space, saying that he thinks the public and private sides of the market are "feeding off one another" as the sector continues to grow.

On the educational side, Tucker said that his group and others are collaborating with the College Board to add another AP Geography class—one that would be more focused on the hard skills associated with GIS and the latest in GIS technology. Between the new course and the more humanities-oriented course currently offered, the hope would be to prepare students with a solid foundation for a major in geography at the college level. Tucker, whose academic background is in political science, signaled optimism that the field will continue to grow, if for no other reason than he believes that "a large swath of humanity is actually closet geographers—they love maps, they love navigation, and they are curious about the world around them."

Design and publish beautiful maps


Quickly create beautiful interactive maps and data visualizations using Mapbox Studio. For basic customization, create maps with our online Map Editor. From geocoding and routing to offline caching, our open API is more than just beautiful maps. We make it easy to power everything geo in your application. Mapbox servers are fully redundant, with automatic failover. If a server fails, traffic is re-routed to ensure maps are served without downtime. Mapbox serves maps from 30 globally distributed edge servers. With an edge server close by, Mapbox is fast no matter where you are. Fast and knowledgeable support is part of every Mapbox plan, and our enterprise customers get incredible same-day help.
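As a concrete illustration of the API claim above, here is a minimal sketch of a forward-geocoding request against Mapbox's Geocoding API (v5). The access token is a placeholder you would replace with your own, and the exact response fields worth reading depend on your use case and API version.

import requests

ACCESS_TOKEN = "pk.your_token_here"  # placeholder; create a real token in your Mapbox account

# Forward geocoding: turn a place name into coordinates
url = "https://api.mapbox.com/geocoding/v5/mapbox.places/Chicago.json"
resp = requests.get(url, params={"access_token": ACCESS_TOKEN})
resp.raise_for_status()

# The response is GeoJSON-like; each feature has a name and a [longitude, latitude] center
top_result = resp.json()["features"][0]
print(top_result["place_name"], top_result["center"])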


The CIA Is Celebrating Its Cartography Division’s 75th Anniversary by Sharing Declassified Maps


As much as James Bond is defined by his outlandish gadgets, one of the most important tools for real-life spies is actually much less flashy: maps. Whether used to gather information or plan an attack, good maps are an integral part of the tradecraft of espionage. Now, to celebrate 75 years of serious cartography, the Central Intelligence Agency has declassified and put decades of once-secret maps online.

These days, the C.I.A. and other intelligence agencies rely more on digital mapping technologies and satellite images to make their maps, but for decades the agency relied on geographers and cartographers for planning and executing operations around the world. Because these maps could literally mean the difference between life and death for spies and soldiers alike, making them as accurate as possible was paramount, Greg Miller reports for National Geographic.

"During [the 1940s], in support of the military's efforts in World War II...cartographers pioneered many map production and thematic design techniques, including the construction of 3D map models," the C.I.A. writes in a statement.

At the time, cartographers and mapmakers had to rely on existing maps, carefully replicating information about enemy terrain in pen on large translucent sheets of acetate. The final maps were made by stacking these sheets on top of one another according to what information was needed, then photographed and reproduced at a smaller size, Miller reports. All of this was done under the watchful eye of the then-26-year-old Arthur H. Robinson, the Cartography Center's founder.

Though World War II-era intelligence services like the Office of the Coordinator of Information and the Office of Strategic Services eventually morphed into the C.I.A. as we know it today, the Cartography Center was a constant element of the United States' influence abroad. Looking through the collection of declassified maps is like looking into a series of windows through which government officials and intelligence agents viewed the world for decades, Allison Meier reports for Hyperallergic. From the early focus on Nazi Germany and the Japanese Empire, the maps show shifting attention towards the Soviet Union, Vietnam and the Middle East, to name just a few examples.

As interesting as these maps are to look at, it's sobering to remember that they played a major role in shaping the global politics of the 20th century. These were the documents that U.S. government officials relied on for decades, whether it was predicting global trade in the 1950s or preparing for the Bay of Pigs invasion in Cuba in the 1960s. Intelligence briefings may more often be done digitally these days, but whatever medium a map is made in, knowing where you are going remains critical to understanding—and influencing—world affairs.

EU Migration to and from the UK


The recent UK government defeat on its Brexit bill by the House of Lords, based on the demand that ministers should guarantee EU nationals' right to stay in the UK after Brexit, was just the latest episode in the debate about EU migration and the United Kingdom's role in it. The topic of migrants in the UK was an important element of the EU referendum campaigns in 2016 which led to the decision to leave the European Union. The government's position sees the question of the rights of EU migrants as part of the upcoming negotiations with the EU, where the rights of UK citizens living in the European Union also need to be agreed. In terms of absolute numbers, this is a much smaller group of affected people (approximately 1.2 million UK citizens are estimated to live in other EU countries) compared to other EU citizens living in the UK (estimated at around 3.2 million).

The following two maps show these numbers in comparison in their geographical dimension. Using the most recent annual estimates (published in late 2016 by the ONS, with further data from the UN), the cartograms show the countries of the EU (excluding the UK) distorted by the number of UK citizens who are living in another EU country (left map) and by the number of other EU citizens who are living in the UK (right map). The countries are shaded by the ratio between the number of UK migrants living in that country and the number of that country's migrants living in the UK (a worked example of this ratio appears at the end of this post).

These maps show that, in absolute numbers, the largest number of British migrants in other EU countries live in Spain (309,000), followed by Ireland (255,000), France (185,000), Germany (103,000) and Italy (65,000). On the other side, Polish citizens are the largest migrant group from the EU living in the UK (approximately 916,000), followed by Irish (332,000), Romanian (233,000), Portuguese (219,000) and Italian (192,000) citizens.

Looking at the migration flows between the UK and individual EU countries in relative terms, Luxembourg, Cyprus, Spain, France and Belgium host significantly larger shares of UK citizens than the shares of their own citizens who have moved to the UK, while at the other end the UK is more attractive for citizens from Latvia, Romania, Lithuania, Estonia and Poland than those countries are for UK citizens.

It is important to keep in mind that these numbers are estimates which contain uncertainties, since the UK has no population register. The numbers are also quite dynamic, and not least the decision to leave the European Union could lead to changing trends in migration flows between the UK and the rest of the EU. The following video from the Oxford Migration Observatory explains the underlying numbers that were used in the above maps:

The content on this page has been created by Benjamin Hennig using data by the ONS and the UN. Please contact me for further details and the terms of use.
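To make the shading concrete, here is the ratio computed for Ireland, the one country for which both directions of the flow are quoted above. This is just a worked illustration of the figures already given; the variable names are mine.

# Ratio used to shade the cartograms: UK citizens living in a given EU country,
# divided by that country's citizens living in the UK (figures quoted above).
uk_citizens_in_ireland = 255_000
irish_citizens_in_uk = 332_000

ratio = uk_citizens_in_ireland / irish_citizens_in_uk
print(round(ratio, 2))  # ~0.77: slightly more Irish citizens live in the UK than Britons in Ireland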

2017 National Geospatial Preparedness Summit


The Nation's only preparedness summit dedicated to advancing the use of location-enabled decision support technology and data. Registration is now open!

When: August 7-9
Where: University of Alabama, Tuscaloosa

The National Geospatial Preparedness Summit (NGPS) brings together public safety practitioners and GIS responders to build skills in developing and implementing GIS-based decision support tools, share best practices, build peer relationships, and validate skills and capabilities through workshops and exercises. During the three days, you will have the opportunity to participate in:

Day 1 – August 7
- Hands-On Technical Training for GIS responders and practitioners
- National Data Preparedness Workshop for public safety operators, decision-makers, and GIS leaders
- Site Visit to the National Water Center and its Emergency Operations Center
- Workshop with USGIF on Strengthening Community Resilience through Location-Enabled Data, Technology, and Analysis
- Map Gallery Competition and Networking Social – reserve a spot in the Map Gallery by July 1

Day 2 – August 8
- National Plenary Sessions featuring keynote speakers leading the charge in advancing the use of GIS by first responders nationwide
- Best Practice Lightning Talks and Panel Sessions focused on lessons learned from real-world events – submit an abstract by April 31 for consideration in presenting or serving as a panelist
- Workshops across disciplines to develop actionable solutions for advancing GIS use in preparedness and emergency operations
- NAPSG Foundation Reception for the Annual Awards in Public Safety GIS Excellence – stay tuned for the award nomination period opening soon!

Day 3 – August 9
- Keynote speakers
- Functional Exercise & Stress Test to validate your ability to develop and apply GIS-based tools for decision support in a disaster scenario
- Hot Wash and Debrief to develop strategies for implementing skills gained, best practices, tools, and resources discussed over the three days

Travel Scholarships: As in past years, a limited number of travel scholarships will be provided to eligible applicants. The travel scholarship application deadline is May 31, and all applicants must also register.

Who should attend?*
- Public Safety Officials & First Responders – management-level and operational public safety. Disciplines include Emergency Management, Law Enforcement, Fusion Centers, Fire/EMS, and Search & Rescue; summit content will be applicable to all disciplines.
- National Guard & Military Personnel – state and National Guard Bureau and military leaders who work in, or are interested in, advancing the use of GIS for domestic operations and homeland security.
- GIS Practitioners & Responders – State GIS Coordinators/GIOs, GIS Technicians, Specialists, GIS Managers, Technology Coordinators, Local CTOs, IT Specialists, and Interoperability Coordinators.
- Federal Agencies – FEMA HQ and Regions, Depts. of Homeland Security, Justice, FBI, Energy, Transportation, NOAA, DOI, etc.
- Volunteers that Support Disasters – American Red Cross, Crisis Mappers, etc.
- Infrastructure Owners & Operators – Railroads, Airports, Utility Providers, Water and Wastewater Providers, etc.
- Private Sector – only available to event sponsors.

*Due to space limitations, registering does not guarantee a seat at the Summit. Participants will receive a decision notification via email within two weeks after registering. Hotel room block details will be provided to all approved participants.

NGPS Points of Contact:
Logistics, Registration, and Hotel Information: Angela Pervel, telephone (202) 895-1711, apervel@publicsafetygis.org
Agenda, Speakers, and Sponsorships: Rebecca Harned, telephone (202) 895-1711, rharned@publicsafetygis.org

15 years of migration in 15 mesmerizing maps


President Trump's administration has made repeated claims that its proposed immigration bans are meant to stop the flow of refugees across US borders. But what does that flow actually look like compared to the rest of the world?

Global security expert Robert Muggah, research director at the think tank Igarapé Institute, knows. Earth TimeLapse, an interactive platform created by Muggah and Carnegie Mellon University, details, over a 16-year span from 2000 to 2015, where migrants are leaving and arriving. Data comes from the United Nations High Commissioner for Refugees (UNHCR). Each red dot represents 17 refugees arriving in a country, while yellow dots represent refugees leaving their home country behind. The resulting maps are nothing short of mesmerizing.

2001 saw roughly 500,000 refugees fleeing countries such as Afghanistan and Macedonia, as well as African countries such as Sudan and the Democratic Republic of Congo. By 2002, both the number of newly displaced refugees and the total number of refugees had fallen from 2001. Still, large numbers of people fled war-torn African countries for safer neighboring nations or havens in Europe.

Due to the War in Darfur, 2003 primarily saw an outflow of refugees from Sudan to nearby Chad. The UNHCR estimates roughly 100,000 new refugees came from Sudan alone. Sudanese migration continued into 2004, in addition to waves of Afghan refugees fleeing into Pakistan and Iran. The US also saw an uptick that year due to an influx of Somali refugees.

By the end of 2005, the total number of refugees had hit its lowest point since 1980, with an estimated 8.4 million refugees worldwide. That's 1 million fewer than at the start of 2005. Pakistan and Iran were the two main asylum countries. Within the year, the number climbed back up to 9.9 million refugees worldwide.

2006 saw the Lebanon War drive people into Syria and Germany, as well as the Sri Lankan Civil War drive citizens to India. Numbers continued to rise, to 11.4 million refugees, by 2007. Moves were driven by ongoing conflict in eastern Africa, and now South America. Colombian refugees fled to Venezuela, Ecuador, and the US. An uprising in Myanmar also pushed citizens south.

In 2008, Afghan and Iraqi refugees accounted for half of the total number gathered by UNHCR. India also became a major asylum for roughly 100,000 Tibetan refugees. Somali refugees continued to pour into surrounding nations.

Germany became the epicenter for asylum seekers in 2009, due to a plea from the European Union for it to resettle displaced migrants. Iraqi refugees continued to make up a major share of the total population.

Thailand saw surges of Myanmar refugees in 2010, due to ongoing civil war. A wave of migrants from Cuba also made their way to the US and Puerto Rico that year.

In 2011, Congolese, Sudanese, and Somali refugees made up the largest portion of displaced persons. Flows to Europe slowed, while Haitian refugees made their way to the US and the Dominican Republic.

The Syrian Civil War was in full swing by 2012. Tens of thousands of migrants flowed south into Jordan and north into Germany, Sweden, and other European countries. Infighting in Sudan and Somalia held steady.

The spring of 2013 saw an influx of Ukrainian refugees fleeing the civil unrest caused by mass protests. People fled mostly to Russia. Syrian refugees continued to enter Germany, while dual conflicts in the Central African Republic and Sudan led to swirling movement patterns.

Russia's annexation of Crimea in 2014 produced an explosion of refugees in Ukraine. South Africa also began accepting people by the hundreds of thousands, as they came from Somalia and landlocked countries further inland.

By 2015, the Syrian refugee crisis had become by far the dominant movement of displaced persons. More than a million people have arrived in Europe, and a similar number in South Africa. Despite political fears, the US remains almost a non-player on the global stage.

General disclaimer: The designations employed and the presentation of material on this map do not imply the expression of any opinion on the part of the World Economic Forum concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries.

Map Analytics: using Mango + Maptiks


We've integrated the fantastic map analytics platform Maptiks, so you can turn your users' web map interactions into actionable insights and increase engagement on your web maps. Maptiks lets you track all the good stuff like bounce rates, visit duration, and activity per map load, as well as map-specific interactions like pans, zooms, and the geographic areas users viewed.

Sample report from Maptiks

It's simple to start tracking map analytics for your Mango maps with Maptiks – just sign up for a free trial over at Maptiks, drop your tracking ID into Mango, and all your public maps will start tracking.

Alongside Maptiks is our new Google Analytics integration. Using a tracking ID, you can view everything Analytics offers, including audience demographics, acquisition channels, and user behaviour. This will allow you to track general user activity, but not the detailed map interaction tracking that Maptiks offers.

Maptiks and Google Analytics are available on all Enterprise and Agency plans.

Giant ringed planet likely cause of mysterious eclipses


- A giant gas planet – up to 50 times the mass of Jupiter, encircled by a ring of dust – is likely hurtling around a star over 1,000 light years away from Earth, according to an international team of astronomers led by the University of Warwick
- Light from a young star – PDS 110 in the Orion constellation – is regularly blocked by a large object, thought to be an orbiting planet
- The next eclipse is predicted to take place in September this year, and amateur astronomers across the world will be able to witness it
- Moons may be forming in the habitable zone around the star, leading to the possibility that life could thrive within the system

A giant gas planet – up to fifty times the mass of Jupiter, encircled by a ring of dust – is likely hurtling around a star more than a thousand light years away from Earth, according to new research by an international team of astronomers, led by the University of Warwick. Hugh Osborn, a researcher from Warwick's Astrophysics Group, has identified that the light from this rare young star is regularly blocked by a large object, and predicts that these eclipses are caused by the orbit of this as-yet undiscovered planet.

Using data from the Wide Angle Search for Planets (WASP) and Kilodegree Extremely Little Telescope (KELT), Osborn and fellow researchers from Harvard University, Vanderbilt University, and Leiden Observatory analysed fifteen years of the star's activity. "We found a hint that this was an interesting object in data from the WASP survey," said Hugh Osborn, lead author, who discovered the unusual light curve, "but it wasn't until we found a second, almost identical eclipse in the KELT survey data that we knew we had something special."

They discovered that every two and a half years, the light from this distant star – PDS 110 in the Orion constellation, which is the same temperature as, and slightly larger than, our sun – is reduced to thirty percent for about two to three weeks. Two notable eclipses observed were in November 2008 and January 2011. "What's exciting is that during both eclipses we see the light from the star change rapidly, and that suggests that there are rings in the eclipsing object, but these rings are many times larger than the rings around Saturn," says Leiden astronomer Matthew Kenworthy.

Assuming the dips in starlight are coming from an orbiting planet, the next eclipse is predicted to take place in September this year, and the star is bright enough that amateur astronomers all over the world will be able to witness it and gather new data. Only then will we be certain what is causing the mysterious eclipses. If confirmed in September, PDS 110 will be the first giant ring system that has a known orbital period. "September's eclipse will let us study the intricate structure around PDS 110 in detail for the first time, and hopefully prove that what we are seeing is a giant exoplanet and its moons in the process of formation," comments Hugh Osborn.

The researchers suggest that moons could be forming in the habitable zone around PDS 110, pointing to the possibility that life could thrive in this system. The eclipses can also be used to discover the conditions for forming planets and their moons at an early time in the life of a star, providing a unique insight into the forming processes that happened in our solar system.

The research, 'Periodic Eclipses of the Young Star PDS 110 Discovered with WASP and KELT Photometry', is due to be published in the Monthly Notices of the Royal Astronomical Society.
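A rough back-of-envelope estimate shows why the eclipsing structure has to be so much larger than Saturn's rings. Assuming a roughly solar-mass star and a circular orbit (simplifications of my own, not figures taken from the paper), Kepler's third law gives the orbital distance, and the eclipse duration then sets the scale of whatever is blocking the light:

import math

# Back-of-envelope only: assumes a ~1 solar-mass star and a circular orbit
period_years = 2.5                       # time between eclipses
a_au = period_years ** (2.0 / 3.0)       # Kepler's third law in solar units: a^3 = P^2

AU_KM = 1.496e8
YEAR_S = 3.156e7
v_km_s = 2 * math.pi * a_au * AU_KM / (period_years * YEAR_S)   # orbital speed

eclipse_days = 17.5                      # roughly "two to three weeks"
size_km = v_km_s * eclipse_days * 86400  # distance covered during the eclipse

print(f"orbital distance ~{a_au:.1f} AU, orbital speed ~{v_km_s:.0f} km/s")
print(f"occulting structure ~{size_km:.1e} km across")  # ~3e7 km; Saturn's main rings span ~3e5 km

Under those assumptions the occulting structure works out to tens of millions of kilometres across, roughly a hundred times the span of Saturn's main rings, which is consistent with Kenworthy's description quoted above.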
Image: Artist's impression of the giant gas planet orbiting the star PDS 110, credit University of Warwick. Click image for high-res. 31 May 2017 Further information, contact: Luke Walton, International Press Officer +44 (0) 7824 540 863 +44 (0) 2476 150 868 L dot Walton dot 1 at warwick dot ac dot uk -------------------------------------------- Click here to see full paper.