I tried switching to OSM in an effort to get away from Google Maps. People always say 'OSM is consumer ready' but forget that OSM is first and foremost the dataset. And as far as I am concerned, there is no good maps app on the iOS App Store that even comes close to Google Maps, and I think I tried pretty much all the relevant ones. Skobbler (which I just found out is part of coast) seems to be the most polished. But as long as there is no unified search and you have to enter street name/address/city in separate boxes, which rules out searching for the names of buildings (for example university buildings) whose names clearly exist in the database and are shown on the map, I have no choice but to switch back to Google.
I really hope this changes soon and the UI/apps catch up to the greatness that is OSM.
Edit: I have an iPhone, and although I live in Canada I still use the German App Store (CC requirement).
That's why I can't comment on the app the original post was about, as it's only available in the US store. Mine was more of a general remark about my frustration with the state of apps using the OSM dataset. I want to get rid of Google Maps, and I feel the maps part of OSM would be ready for that. I can tolerate the commercial store data not being as up to date. But if the general usability suffers, I'd rather opt for Google Maps.
I tried using OSM a month ago when I was travelling around Bali, Indonesia. Google Maps is terrible there, so it wasn't like I had much choice. I looked on Google Play and found an app called "OsmAnd Maps & Navigation" that was well rated; it was also offline, which was a plus, as there is limited mobile coverage outside of the main parts of the island. I downloaded it, then the dataset for Indonesia (200 MB; it took a while), and tried finding a route. After a few minutes (!) it had calculated one, but the route definitely wasn't optimal. I tried another app, "Sygic: GPS Navigation & Maps"; it was premium but had a 7-day free trial (perfect for my 3-day road trip). The route it calculated seemed a lot better, so I went with that. I was actually really surprised by how well it worked; even in the north of the island, which is quite undeveloped in places, it was able to find the best routes.
TL;DR: Even though the dataset may be ready, you still need supporting consumer-ready apps to use it.
(Originally posted this as a separate comment, but it is pretty relevant to this).
Have you tried the Scout App (which the article is about)? How does its search work?
I agree that geocoding has been a problem for OSM, especially for building numbers, where the data often isn't there. But on the software side there seems to be progress recently, for instance, here are two open-source geocoders released in the last few months:
On building numbers, we need geocoders which can interpolate between sparse data, and then use that system to highlight where the gaps are. People tend to add data to OSM when they're confident that it's actually being used to its full extent. If there are a hundred buildings along a road, in theory you don't need to add many of them before you can get a very good idea of where an address might be.
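To make the idea concrete, here is a minimal sketch of that kind of interpolation, assuming even spacing of house numbers between the few addresses we do know. None of this is from any of the geocoders mentioned; the function and data are made up for illustration.

```python
def interpolate_address(known, target_number):
    """Estimate a (lat, lon) for target_number by linear interpolation.

    known: list of (house_number, (lat, lon)) pairs for one street,
    sorted by house number. Returns None if target_number falls
    outside the known range.
    """
    for (n1, p1), (n2, p2) in zip(known, known[1:]):
        if n1 <= target_number <= n2:
            # Fraction of the way from the lower known address to the upper one.
            t = (target_number - n1) / (n2 - n1)
            return (p1[0] + t * (p2[0] - p1[0]),
                    p1[1] + t * (p2[1] - p1[1]))
    return None

# Two known addresses are enough to guess any number in between.
known = [(2, (52.0, 13.0)), (100, (52.0, 13.1))]
print(interpolate_address(known, 51))  # somewhere near the middle of the block
```

A real geocoder would also need to follow the road geometry rather than a straight line, but even this crude estimate narrows a hundred-building street down to a small search area.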
The map should communicate that it is interpolating the geocoded address, e.g. with a circle of probability, or even show the known addresses and an alpha-blended triangle between them.
As you say, assistance from a mapping service need not be binary; just add some extra information and communicate the probability of the result cleanly.
Interesting projects! I have tried to use the MapQuest geocoder API, but for anything outside the US it's worse than useless. Will definitely check those out. It's great that they're based on Elasticsearch, since obviously geocoding is about full-text search more than it is about "geo".
Not even Google Maps has full building-number information, at least not worldwide. I've been to several European countries where building numbers are ignored in Google Maps searches (sometimes you actually get no result at all if you include a building number).
Agreed. Google Maps isn't just about addresses. For example I just searched for "museum about noodles in tokyo" and it correctly came up with the Shin-Yokohama Ramen Museum. I typed in "mona lisa museum" and it came up with The Louvre.
It's not perfect but it doesn't seem like a map based solely on map data is going to be able to compete with one that has all of the internet behind it.
This is my biggest complaint about Apple Maps on iOS. Searching doesn't work. And if I get an address or name partially correct, Apple Maps usually fails.
Google can find me places of interest, without an exact name or address. That's my most common use case when using navigation on my mobile device.
It's not just maps. No matter where you are searching, Google is very aggressive about trying to figure out what you meant. Almost no one else bothers to try. Amazingly, even Amazon didn't try to correct misspelled product searches until recently (a year or two ago, IIRC).
That's a huge and under-appreciated advantage for Google's mobile apps and services.
The screenshot they posted at the top of the blog post is very illustrative. Although I may just be used to the "Google Maps look", the OSM map displayed there is much harder to understand. I can't put my finger on it exactly, but there's too much color and detail I'm not interested in when looking at a map. I don't get this feeling when I use Bing Maps or Here Maps either (and I have used both quite a bit in the past).
Let's hope the increased use of the incredibly detailed and free OSM data will one day lead to better applications that make use of it. User experience comes first.
I just tried the OP's app (Scout Maps), which is free, and it looks pretty good so far for where I'm at. Not as clear or concise as Apple or Google maps, but the road data seems good so far.
Isn't Apple Maps based on OSM? Which basically means we've got more players here. All that's left is for Google to jump on the bandwagon and we're in business.
Apple used some of the old data, from before OSM changed its license to share-alike. That data is probably still in there somewhere, but I think it's very unlikely they are incorporating any OSM updates from the past five years or so, because the license would mean they'd have to release their combined dataset.
1) cryptostorm would still have the IP the ISP assigned a user. Mapping this IP to a real name is trivial, right?
2) The cryptostorm team decided to remain 'pseudoanonymous' at this point. The points they outline (privacy activists get constantly hassled and threatened) make sense, but they don't help me verify the integrity of the service. You saying that you spoke to them and that they are trustworthy doesn't do much either. Why should I trust them? I know in the end I should trust no one, but your argument boils down to cryptostorm being outside of FRA jurisdiction?
You could argue that iPredator is not a real crypto/security VPN service in the first place. I think they are using 128-bit encryption, which can be cracked if enough effort is put into it. They are just making it more difficult, eliminating 'drive-by snooping'.
Lastly: no offense, but that website is not very trust-inducing. I know it shouldn't matter, but still...
No, my argument boils down to "cryptostorm doesn't know who you are". They isolated their accounting (the part that has to collect money and ties an individual to an account) from their VPN service. They compartmented their operations from their business. So their customers are anonymous to them.
You purchase an access token (time-limited from first use) from a third party (cryptostorm offers bulk rates for resellers). The entity which sells the tokens is based in a First Nation in Canada, meaning it has a reduced legal attack surface. This entity is distinct and separate from cryptostorm.is, the VPN service provider. They are compartmented and share no information. Neither one has sufficient information to link a specific individual to any activity.
That's the beauty of what they've done, they've made it so that you don't have to trust them. As I said, they could be compromised and log everything, it doesn't matter. They cannot tie an account to an individual. That's the problem that they solved. They removed trust from the equation.
Now, indeed, you should not use a VPN for anonymity. That is not what they are designed for and that is not what they are capable of providing. However, given that the cryptostorm VPN service can only know:
* your originating IP,
* your (anonymous) token ID, and
* the packet stream that exits their servers...
You can easily ensure a level of anonymity for your internet usage by accessing the VPN from an IP that is not associated with you (e.g. a public library, a coffee shop, etc.). Provided you maintain discipline and never access it from an IP "owned" by you, they cannot know who you are.
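The compartmentation idea can be sketched in a few lines. This is my own toy illustration of the concept, not cryptostorm's actual scheme: the reseller hands out a random token (and is the only party who knows who paid), while the VPN side stores only a hash of it, so it can validate access without holding any identity or payment data.

```python
import hashlib
import secrets

def reseller_issue_token():
    # The reseller knows who paid for this token, but never sees VPN usage.
    return secrets.token_hex(16)

class VpnAuth:
    """VPN-side validator: knows which token hashes are valid, nothing else."""

    def __init__(self):
        self.valid_hashes = set()  # no names, emails, or payment records

    def register(self, token):
        # The reseller forwards only the hash, never the buyer's identity.
        self.valid_hashes.add(hashlib.sha256(token.encode()).hexdigest())

    def authenticate(self, token):
        return hashlib.sha256(token.encode()).hexdigest() in self.valid_hashes

token = reseller_issue_token()
vpn = VpnAuth()
vpn.register(token)
assert vpn.authenticate(token)
assert not vpn.authenticate("not-a-real-token")
```

The point of the design is that neither database is useful on its own: the reseller's records link people to tokens but not to traffic, and the VPN's records link token hashes to sessions but not to people.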
I've spoken and written before about how VPNs are not tools for anonymity. A recent example is a "no logs" VPN used to catch a kid sending bomb threats to his school [0]. A VPN service is essentially just a proxy, and no single hop proxy is going to deter a nation state level actor. VPNs are tools for privacy, circumventing stupid IP restrictions, and evading (some) network access controls. They're not safe for robust clandestine activity.
I've spoken with them and they are competent, have been doing VPNs for years, and are passionate about privacy and security. That doesn't mean I trust them. The beauty of their architecture is that I don't have to.
I am really looking forward to TextSecure for iOS. I hope I am wrong on this one, but from the text on their website, Heml.is doesn't seem to be too eager to open-source their code after release.
I don't know any details about Whisper Systems (except that Moxie Marlinspike is with them), but I sure do hope they can provide a well-designed, completely open-source cross-platform messaging app (which I don't think exists yet).
"We have all intentions of opening up the source as much as possible for scrutiny and help! What we really want people to understand however, is that Open Source in itself does not guarantee any privacy or safety. It sure helps with transparency, but technology by itself is not enough."
They have no intention of releasing the source code. Use https://www.surespot.me/ instead, it does the same stuff, already exists, and is released under the GPL (v3).
Surespot depends on a bunch of google play services and is officially distributed on google play. Is there a way to install pre-compiled surespot apk outside of google play for those that don't install proprietary google code on their phone? I noticed that the open-source android apk repository F-Droid can't distribute it for this reason: https://f-droid.org/forums/topic/surespot-encrypted-messagin...
(Side note: moxie prohibits TextSecure on F-Droid as there is no forced auto update like google play. I currently have to download and compile the TextSecure source code myself, which is no biggie, but as a CM user, I'm definitely excited about this integration!)
Personally, I've used both, but settled on SureSpot for the moment. SureSpot uses data exclusively, which is cheaper than SMS for me. Although I understand that TextSecure now has (or will be getting soon) a data channel. So I'll definitely take another look.
Moxie has proven himself to be more than capable of building such a system, but the author of SureSpot seems more than competent too. See the section titled "Technical Overview" on:
Interesting fact: TextSecure wasn't made open source until it was bought by Twitter: https://dev.twitter.com/blog/whispers-are-true - IIRC, prior to this the website claimed it was open source, but offered no way of getting the source, and if you asked for it, you would find out it was only given to trusted third parties to perform security reviews.
Sounds like a trap to me, not having the source code. Maybe it works for now, but rather than investing trust as things stand, I'd go with an open-source client.
Trust must be earned. So far they brag about the way they made the tech work with a patched version of Android; they don't really put forth anything that gives them credibility as a very secure protocol.
People insist on looking at this through their default prism of "closed source bad, open source good". But people with crypto experience have other prisms; for instance, "competent, well-vetted crypto" versus "amateur enthusiast crypto". Sometimes open source is also competent and well-vetted, but vetting is expensive, and there is a lot of amateur crypto out there.
> Sometimes open source is also competent and well-vetted, but vetting is expensive, and there is a lot of amateur crypto out there.
You seem to be implying that one must be a hobbyist in order to write incompetent crypto software with no (or incompetent) review, and that one tends to need company resources to get quality code reviews.
Having crypto is often an important checkmark to tack on when shipping a product, and usually no one in the product group is competent to analyze the security of the way they tacked on encryption. If a few people in the larger company are competent, they will avoid reviewing these projects. Being the engineer everyone associates with delays and frustrations doesn't do much for you, and there will never be any proof of the costs you may have prevented.
The few implementations I have seen that were better than I know how to criticize have usually had considerable cross-company and university involvement. That usually means open source, or a lot of NDAs and complex license agreements for cross-organization code sharing.
I have no idea what you're trying to say here, but just a random stab at responding: my perspective in this discussion comes from managing a consulting practice that, among a few other things, specializes in assessing the security of cryptographic implementations.
I've been in a role of evaluating security vulnerabilities on security products and features from many different origins..
All I am saying is that I am in a position to estimate that ~9/10 of everything critical exceeds the competence of its authors to safely combine features and security. So a primary explanation for failure that only applies to 40% (60%?) of the market doesn't sound right to me.
So either we disagree considerably on the proportion of software that is poorly implemented, or you are saying the majority of commercial software is also written by hobbyists?
I stifled the urge to say the same thing, but then realized that I'd lose the evening to defining what "mainstream" meant, after people dredged up random examples of snake oil from Schneier blog posts; not to mention the inevitable rehashing of the "beware custom algorithms and 390244 bit keys" thread, which is going to have to happen now because bringing up crypto truisms from the late-90s makes people feel smart.
You're being unfair and you know it. Lavabit, the RSA fiasco, Apple's iMessage crypto, etc. are all perfectly mainstream examples of closed-source crypto done wrong. As you said yourself, the only thing that conclusively makes a difference is whether the crypto is "well vetted"; having the source available is simply a means of making this easier. Classifying the quality of crypto implementations based on the source model alone ("The track record of open source cryptography is bad.") is just disingenuous.
No, I'm not being unfair. I don't think "open source" versus "closed source" has much at all to do with how secure a cryptosystem is. I do think having Trevor Perrin and Moxie Marlinspike working on your crypto design has a lot to do with how secure a cryptosystem is.
Yes, you are being unfair. You can't say "I don't think 'open source' versus 'closed source' has much at all to do with how secure a cryptosystem is" (somewhat agreed) while simultaneously saying "The track record of open source cryptography is bad" (utter nonsense and misleading), unless your point is that closed source cryptography has an equally bad record.
Please explain for those of us who are not good enough in the field (I'm genuinely asking).
I was under the impression that software like GnuPG and OpenSSL could be considered safe, so seeing a security professional warning about a negative track record of open source cryptography is worrisome.
What exactly should we be careful of when it comes to open source cryptography?
Not all open source code is broken; just a lot of it is. I think tptacek is trying to say that open source vs closed source is a mediocre predictor of the quality of a cryptosystem :)
Thanks for mentioning that. I never knew that uTorrent is actually developed by BitTorrent Inc. That indeed makes it very hard to trust the BT Sync service.
uTorrent was one of the best (if not the best) engineered Windows programs. It was small, fast and really really well thought out. Just beautiful. I would be so damn proud if it were my creation :)
Then BitTorrent came along and bought it. On one hand I'm really happy for the dev, but on the other hand it was the start of a steady decline of both the quality and the ethics. BT does some really interesting stuff, but I really wish they wouldn't pee on their own parade with all those uT ads and bundling.
Best of luck. I really hope you succeed with this. It seems that right now all open-source Dropbox alternatives, while great, have a shortcoming somewhere (git-annex, SparkleShare, BTSync...).
Can anyone who is running this explain why and how they use it? As far as I understand it, it's not a media center solution but rather a self-hosted social network focused on sharing media.
What would a typical use case look like? Setting up a MediaGoblin instance to share vacation photos and videos with your friends?
I set this up after my first child was born so that my wife and I, along with all of the other family members could post pictures and videos of our daughter (and our family) on a machine I control. When someone babysits her and takes pictures/videos they too can post them.
This serves us in many ways. For instance, we have all of the pictures in one place, not spread over many services depending on where family members post them -- although some family members also post to Facebook :( For the grandmothers and grandfathers we just give them the URL instead of trying to explain a YouTube channel to them. They know how URLs work, not how YouTube works.
This also serves as a backup. The server is hosted outside our apartment so the pictures we put there can be easily fetched in case something happens to our apartment (and computers - but this can also be fixed with a backup server, this is a side effect).
Basically, family album in one place -- no matter who the photographer is (and backup).
It's not an automatic backup system... and it's not a copy of all photos/videos on our computers.
The pictures/videos we want to keep (would not want to lose in a fire) we upload to our MediaGoblin instance. MediaGoblin saves them in a folder on the server (the original media files and the processed ones). If something happens to our home, we'll be able to easily fetch the most precious images from the server.
The only thing I'd have to work around is that every media file is stored in a separate folder along with the processed files, so I would have to write a script to "go into each folder and fetch the file that doesn't end with medium.[ext] or thumbnail.[ext]". Come to think of it, it might be worth it to just write a "Download all originals" plugin, since MediaGoblin already knows the originals... but since our apartment hasn't been destroyed, I'm not in a hurry.
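For what it's worth, the script described above is only a few lines. A minimal sketch, assuming the layout described (one folder per media item, with derivatives whose names end in "medium" or "thumbnail" before the extension); the directory path is illustrative, not MediaGoblin's actual config value:

```python
import os

def find_originals(media_root):
    """Walk media_root and return paths to every file that is not one of
    the generated "medium" or "thumbnail" derivatives."""
    originals = []
    for dirpath, _dirnames, filenames in os.walk(media_root):
        for name in filenames:
            stem, _ext = os.path.splitext(name)
            # Skip e.g. photo.medium.jpg and photo.thumbnail.jpg,
            # keep the original photo.jpg.
            if not stem.endswith(("medium", "thumbnail")):
                originals.append(os.path.join(dirpath, name))
    return originals

for path in find_originals("/srv/mediagoblin/media"):  # hypothetical path
    print(path)
```

Piping that list into rsync or tar would give a crude "download all originals" backup until a proper plugin exists.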
1. There won't be many people watching it to begin with.
2. Even if there are, it's distributed across that many MediaGoblin installs. Break YouTube's access down that way, and (1/3 * the number of videos they have) becomes the number of different DMCA requests the MAFIAA lawyers will have to work out. It just will not scale the way it does now.
Yes, you need Ruby for it, but I don't really see that as a problem. It's a bit confusing at first (could be just me, but I still don't get why you basically have two repos: one managed by homesick and the one you checked out yourself), but it works fairly well. It takes care of symlinking, updating, and whatever you want.
A problem I see is every car manufacturer building its own system, maybe with a standardized communication channel to speak to other cars, but with the control software itself bound to a certain manufacturer. This means different bugs in each car; fixing a bug in one car doesn't mean the error doesn't exist in other cars, and so on. New players entering the market face pretty much impossible hurdles.
Some kind of Android for autonomous cars would be quite nice I guess.
Especially when you think about the fact that a lot, if not most, traffic jams are ghost jams, where a slight overreaction ripples through the following cars, causing them all to stop for no apparent reason.
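The ripple effect is easy to see in a toy follow-the-leader simulation: one driver taps the brakes, each human-like follower overreacts, and a few cars back the wave has amplified into a full stop. All the numbers here are illustrative, not calibrated traffic-flow constants.

```python
def simulate(n_cars=10, steps=50, overreact=1.5):
    """Return each car's minimum speed after one small braking event."""
    speeds = [30.0] * n_cars
    speeds[0] = 25.0  # lead car taps the brakes once
    min_speeds = list(speeds)
    for _ in range(steps):
        for i in range(1, n_cars):
            gap_change = speeds[i - 1] - speeds[i]
            if gap_change < 0:
                # Car ahead is slower: brake harder than strictly needed.
                speeds[i] += overreact * gap_change
            else:
                # Car ahead is faster: accelerate back cautiously.
                speeds[i] += 0.5 * gap_change
            speeds[i] = max(speeds[i], 0.0)
            min_speeds[i] = min(min_speeds[i], speeds[i])
        speeds[0] = min(speeds[0] + 1.0, 30.0)  # lead car speeds back up
    return min_speeds

print(simulate())  # minimum speeds drop further for each car back in the line
```

With an overreaction factor above 1, the slowdown grows with every car, and cars a few positions back come to a complete stop even though the lead car only slowed by 5 units; an autonomous car that brakes by exactly the gap change (factor 1 or less) would damp the wave instead.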