Meet Scabbycat. He pitched up in our front yard last week sometime.
He’s okay, if a bit neglected-looking — but he doesn’t have plans to leave, so we’re looking after him as best we can. He can’t come to live in our house as he might have something that our cats could catch, but we’ve got him some shelter and are feeding him. The vet has confirmed that he’s not microchipped (so we can’t find his owner) and that he’s reasonably healthy, but there’s nowhere for him to go for a few weeks — none of the cat homes or sanctuaries we’ve contacted has a space.
We’ve tried tweeting and blogging about him, but that only improves the chance of reaching his owners if they’re good at searching the social web.
There is, if you Google, a ‘National Missing Pets Register’ website. It’s nothing official, rather an altruistic effort by a web designer called Steve Dawson — but it has some sort of traction, and visibility is all in this instance. There aren’t many lost/found notices on there, but it’s certainly the main site.
The site notes that developments are still ongoing — adding search would be an easy win in terms of making the site more useful (though I’ve no idea how easy it would be technically), as would listing by location (the tighter the better), notifications, RSS feeds and better photo handling.
A few of these would help other people spread the information (location-based feeds especially), but there’s nothing here that harnesses the social power of the web — so I’m going to throw some ideas about; maybe there’s a service someone could build here.
For me the problem is that the only people on the site are within the transaction — they’re the people who have either found or lost a pet (and even then only a subset of those people). We need something that can use the connections and serendipity of the social web to increase the chances of a reunion.
With tighter location feeds the site could power lost/found pet widgets for local sites and blogs. That would increase the likelihood of someone who can make a connection spotting the pet — and also increase awareness of the site itself.
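No such feed exists today, so this is purely a sketch of the idea: assuming the register published an ordinary RSS feed with a postcode in each item’s category, a local blog widget could filter it down to its own patch in a few lines (all names, URLs and the feed layout here are invented for illustration).

```python
import xml.etree.ElementTree as ET

def items_for_area(rss_xml, postcode_prefix):
    """Return (title, link) pairs for items whose <category>
    starts with the given postcode prefix (e.g. 'B13')."""
    matches = []
    for item in ET.fromstring(rss_xml).iter("item"):
        category = item.findtext("category", "")
        if category.upper().startswith(postcode_prefix.upper()):
            matches.append((item.findtext("title"), item.findtext("link")))
    return matches

# A hypothetical feed -- the real register publishes nothing like this yet.
SAMPLE = """<rss version="2.0"><channel>
  <item><title>Found: tabby cat, Moseley</title>
  <link>http://example.org/found/123</link>
  <category>B13 8JP</category></item>
  <item><title>Lost: terrier, Leeds</title>
  <link>http://example.org/lost/456</link>
  <category>LS6 2AS</category></item>
</channel></rss>"""

print(items_for_area(SAMPLE, "B13"))
```

A widget for a Moseley blog would show only the first item; the Leeds notice never reaches it, which is the whole point of tight location feeds.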
Is there something in game mechanics? — Perhaps attempting to match descriptions or photos of lost/found animals (currently there’s no real way to search for matches), or somehow improving the descriptions, tagging or locations of found animals.
What could be done with mobile or location? — Is there a way that “spottings” (helpful, but not as good as a find) could be registered easily? Could you add information from posters or notices seen in the street that the owners, for whatever reason, haven’t added to the site themselves?
Direct feeds in from dog wardens, cat and dog homes, the police and vets — could the site be useful for them too? Making it as easy as possible for these overworked people needs to be given thought.
Is it ‘fixmypet’?
Can you add? Or more importantly can you build or fund?
Lots of good clean fun on Twitter this morning, where Paul Miller (@rellimluap) pointed to an “Australian Fix My Street”. It’s just a mock-up, created at the Canberra govhack hack day, but it’s great — mainly because of its title:
A lovely dose of self-aware humour (“it’s buggered mate”), which I think is an important engagement tool. Not only does it get attention initially, it gives the starting “community” of users a good idea who the site is for — it’s for “them”, and it’s for people who don’t take themselves too seriously. A bit of fun is a very powerful thing.
Here’s a pipe I’ve created that attempts to marshal the content from hyperlocal blogging in Birmingham and allow people to subscribe only to the feeds that interest them. This is a piece of investigation and experimentation that I’ve been able to find the time to do thanks to Will Perrin and his hyperlocal blogging initiative Talk About Local. Will also helped define the reason why it would be useful to do — for what he called “lazy journalists”.
Lazy here is used in the same way that it might be used — in praise — of a computer programmer; that is, lazy means you’ll work hard at setting yourself up right to make sure you get everything you need easily later on. Will got to the crux of the argument by saying that journalists interested in a subject — let’s say noise abatement issues — could easily find examples of those at a local level outside the areas they physically know.
So this is a run-through of the decisions made in building it (and what other options could work). It’s no more than a prototype at this stage, so comments and improvements are very welcome. If you would rather just get stuck into the pipe itself, head on over.
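For anyone who’d rather not use Yahoo Pipes, the core of what the pipe does — merge several blogs’ RSS feeds and keep only items matching a topic — can be sketched in plain Python. This isn’t the pipe itself, just an illustration of the idea; the feed contents and URLs are made up, and a real version would fetch the feeds over HTTP first.

```python
import xml.etree.ElementTree as ET

def filter_feeds(feeds_xml, keyword):
    """Merge items from several RSS feeds, keeping only those whose
    title or description mentions the keyword (case-insensitive)."""
    keyword = keyword.lower()
    hits = []
    for xml_text in feeds_xml:
        for item in ET.fromstring(xml_text).iter("item"):
            text = (item.findtext("title", "") + " "
                    + item.findtext("description", "")).lower()
            if keyword in text:
                hits.append((item.findtext("title"), item.findtext("link")))
    return hits

# Two invented hyperlocal feeds, for illustration only.
SAMPLE_FEEDS = [
    """<rss version="2.0"><channel><item>
    <title>Noise abatement order served on city bar</title>
    <link>http://example.org/noise-order</link>
    <description>Council acts after residents complain.</description>
    </item></channel></rss>""",
    """<rss version="2.0"><channel><item>
    <title>Allotment open day this Saturday</title>
    <link>http://example.org/allotments</link>
    <description>Free entry, plants for sale.</description>
    </item></channel></rss>""",
]

print(filter_feeds(SAMPLE_FEEDS, "noise"))
```

The “lazy journalist” interested in noise abatement subscribes once to the filtered output and never has to trawl the individual blogs again.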
Supposedly “hyperlocal” thing from Yahoo, hasn’t quite got the “local” bit, yet.
I spent an enjoyable hour with Kate Foley late last week. Kate is Neighbourhood Manager in Lozells, Birmingham, and runs the Life in Lozells blog. The site has been running since March 2007 and is an invaluable resource for local info — but Kate is interested in building more of a community around it, generating and hosting conversation as well as collecting information.
I suggested that an injection of opinion into the blog might help that, which is something that it’s difficult for Kate to do in her official capacity — two possible solutions came to mind:
The first relies on Kate’s real-world network, pulling voices in to contribute; the second can be done in a more online way, but will rely on Kate becoming confident in using search and RSS and building her online connectivity.
Those of you with local blogs, how do you work to build up the conversation?
There’s a new digital divide, a fissure opening wider and wider as the social web encroaches on most forms of information. You may have heard of the ‘darknets’ — unseen networks of computers for filesharing, networks you’re only allowed onto if you’re trusted not to give the game away. What I think I’m seeing emerge is almost the exact opposite: sites that would love to be found, but are becoming increasingly disconnected.
There are hundreds of websites, lovingly researched and maintained by enthusiastic and knowledgeable people, that it’s becoming almost impossible to find. The sites are built on old technology, and that contributes to their decreasing visibility but it’s not the only reason. The lack of RSS feeds, pinging servers, dynamically generated sitemaps and up-to-date robots.txt files makes it more difficult for other sites to keep in touch with them. That they are often built in HTML by hand makes them more difficult to update, and fresh content is prized by search engines.
The lack of RSS, and of any way of knowing when updates occur, also decreases people’s awareness of the sites: you either have to remember they’re there and how you found them — or bookmark them — and check for updates on a regular schedule. Fewer reminders, fewer nudges, so fewer incoming links.
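The fix need not be heavyweight. By way of illustration (the site name, URLs and page list here are all invented), even a hand-built HTML site could publish a minimal RSS 2.0 feed with a short script run whenever a page is added:

```python
import xml.etree.ElementTree as ET

def build_rss(site_title, site_url, pages):
    """Build a minimal RSS 2.0 feed for a hand-maintained site.
    `pages` is a list of (title, url) tuples, newest first."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = site_title
    ET.SubElement(channel, "link").text = site_url
    ET.SubElement(channel, "description").text = "Recent updates"
    for title, url in pages:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = url
    return ET.tostring(rss, encoding="unicode")

# Hypothetical local-history site and page.
feed = build_rss("Local History Pages", "http://example.org/",
                 [("New photos of the old mill", "http://example.org/mill.html")])
print(feed)
```

Upload the resulting file as `feed.xml`, link to it from the homepage, and readers’ feed readers — and other sites — can keep in touch without anyone having to remember to check back.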
They are often in very niche areas, like local history, local news, and so generate a limited number of hyperlinks from other sites. They often only get links from each other, which is great in a community sense but means the incoming links are low in pagerank (which would push them higher up Google searches).
As more and more sites get better at search engine optimisation, as blogs and other social websites link and link again and expand into more areas, and as Google relies on the same sources more and more, these sites are getting less and less visible.
And that’s bad because they have a wealth of important content that we need to be able to find.
I’m calling it the ‘hinternet’.
(From hinterland: in German, the part of a country where few people live and where the infrastructure is underdeveloped.)
Solving the problem is tricky: Google’s mission to index all the world’s information doesn’t always mean that we can find what we’re searching for, and the semantic web will only work if the correct metadata is stored with the hinternet sites (and they’re already often “behind” technology-wise).
Search needs to get better, but those of us on the social web also need to help. We need not only to link to these sites, but — where we can — to help nudge the guardians of the hinternet towards greater visibility by becoming “social”.