Internet Explorer 7 Beta 2 Preview is out.
Note: Runs under Windows XP Service Pack 2 (SP2) only.
via Slashdot: IE 7.0 Beta 2 Available to the Public.
I have added a section to LibSuccess: Web Browser Extensions
ZDNet is expanding its coverage with a series of new blogs on a variety of business and technology topics. We're also seeking IT execs to contribute to our blogs covering the education and government fields.
What's your area of expertise?
If you've got great credentials, good writing ability, and passionate authenticity, we'd like to hear from you. Please send an e-mail, with your full name, preferred daytime contact information, and a relevant writing sample, to David Grober. And yes, we'll pay for steady, high-quality blogging.
ZDNet - Between the Lines - We're looking for a few good bloggers - January 5, 2006
Science librarians are questioning the logic behind an announcement that Health Canada will shortly cut its library budget by 50% -- a decision they say will affect the ability of government scientists, clinicians, and policy makers to do their jobs.
Currently, the government is developing a pilot program called the Federal Science eLibrary. The initiative provides full electronic desktop text access to 2,700 journals for three Canadian sites, said Beverly Brown [of CISTI], a program manager with the eLibrary. However, "it's important to know that there is no funding for this initiative yet," Brown told The Scientist, although it's expected to attract a significant amount. Given the still-unknown future of the eLibrary, these library cuts are short-sighted, said Jessie McGowan, a medical research librarian at the University of Ottawa.
McGowan noted that because Canada doesn't have a national health library, like the National Library of Medicine in the U.S., it relies on collaboration between libraries across the country to provide good sources of information. "So when you see Health Canada making cuts, it hurts not just Health Canada, but all research in the country, because we are so interdependent on each other," said McGowan.
from Science librarians question cuts, The Scientist, 25 Jan 2006
- Social list of books - share your reading experience with other people.
- Recommendation system - find new books to read in the categories of your interest.
- Folksonomy - everybody can use his/her own keywords to categorize books.
- Faceted classification - predefined types of keywords for authors, reading statuses.
- Friend finder - easily discover people with similar reading tastes.
- Virtual community - discuss books and authors with other readers.
Am I permitted to add...
Now, that being said, don't expect miracles. The site only reports "34499 total books", and the revenue model and plan (or as rather gleefully described on the site, the "global world takeover project") appears to be mainly around driving traffic to Amazon.com. However, it does also link to Wikipedia Book Sources, which should land users at your library (you've set your library up in Book Sources, right?)
As I've said before, my recommendation is for libraries to get together, build a common front-end layer, then build your local community while transparently using Amazon's data. (By transparently I mean: a user enters a review on your site, and it also gets copied to Amazon; conversely, if a user in your area puts a review directly on Amazon, it should show up on your site.) The reason to do this is that Amazon has the network advantage - they have way more info than you will probably be able to generate in your relatively smaller local library community. You should be able to easily switch modes: "what tags has everyone on Amazon put on this book" vs "what tags have local users put on this book".
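To make the mode-switch idea concrete, here is a minimal sketch. Everything here (the data shapes, `localTags`, `amazonTags`, `tagsFor`) is hypothetical illustration; real code would need adapters for Amazon's actual API and for your own catalogue.

```javascript
// Tags your local users have applied, keyed by ISBN (illustrative data).
const localTags = {
  "0596007655": ["javascript", "reference", "course-reserve"]
};

// Tags the wider Amazon community has applied (illustrative data).
const amazonTags = {
  "0596007655": ["javascript", "programming", "web"]
};

// Return the tag view for a book: "local", "global", or a merged union.
function tagsFor(isbn, mode) {
  const local = localTags[isbn] || [];
  const global = amazonTags[isbn] || [];
  if (mode === "local") return local;
  if (mode === "global") return global;
  // merged view: union with local tags first, duplicates removed
  return [...new Set([...local, ...global])];
}
```

The point of keeping the two sources separate, rather than merging them at ingest, is that the user can flip between the local community's view and the global one at will.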
In general you can always find publicly-posted NRC jobs (including CISTI) at
Note that the posting times are usually quite brief, that Canadian citizens have an advantage, and that many positions have bilingual-imperative requirements.
Join us for the joint CAIMS-MITACS Annual Conference from June 15 – 20, 2006 at York University in Toronto, Ontario when the top academics in the mathematical sciences community come together to explore advances and share research results. The 17th Canadian Symposium on Fluid Dynamics and the 2006 Summer School on Mathematical Modeling of Infectious Diseases will be held simultaneously with the CAIMS-MITACS Annual Meeting.
This event aims to bring together partner organizations in the academic, public, private and not-for-profit sectors with university researchers to generate solutions. This Annual Conference is your opportunity to participate in the mathematical community by listening to top speakers, networking with colleagues, showcasing your research by presenting a poster, and socializing with your peers at a national level.
Mark your calendar!
Date: June 15 – 20, 2006
Location: York University, Toronto, ON
CFP deadline was December 15, 2005
Monday January 23, 2006
Wikipedia Canadian federal election, 2006 (39th general election)
Canadians: go vote.
UPDATE: Chris Nolan points out in his comment a Canadian Election 2006 Mashup using Google Maps. Quite cool, but I will warn you that "Loading Ridings. This will take a moment..." took more like several minutes for me.
I think this is pretty nifty, although I must say I am strongly against any election guessing (pre-election polls of any kind). I think people need to just wait for the actual votes to be counted.
One of the topics that came out of the SOA Symposium was around reuse, and how hard it is to achieve.
Is it actually possible? We thought we were going to get it with objects, then with (code) libraries, then with components...
Phil Wainewright has a very informative post on Learning to love software reuse
Let's face it: developers love reinventing the wheel. Anyone who believes someone else has built as good a wheel as they could build themselves isn't worthy to call themselves a geek.
Object-oriented development and component-based development have both failed to realize the promises of software reuse because they let developers touch the code. What self-respecting geek is going to leave well alone?
Only when you move to a truly loosely coupled services architecture do you get to reuse software without the temptation of making changes. Properly implemented, a services architecture doesn't allow you to look inside the service and see how it works. You only get to reuse the service just as it is.
Well, that's the theory, anyway. In practice, you come up against all sort of obstacles.
According to the San Jose Mercury News, the Bush administration wants access to Google's huge database of search queries submitted by users, to track how often pornography is returned in results.
Slashdot: U.S. Government Wants Google Search Records
The referenced article itself begins
The Bush administration on Wednesday asked a federal judge to order Google to turn over a broad range of material from its closely guarded databases.
further down, it reports
The government indicated that other, unspecified search engines have agreed to release the information, but not Google.
UPDATE: Extremely detailed report from Search Engine Watch Blog Bush Administration Demands Search Data; Google Says No; AOL, MSN & Yahoo Said Yes.
It seems I have not done a good job of tracking user scripts for Firefox, despite the fact that I think transforming the content of web pages you visit ("living where your users are") is one of the most powerful capabilities libraries should be leveraging. In fact, one may wonder whether it's worth putting effort into upgrading the OPAC interface - why not just build more user scripts?
Here are the ones I have quickly gathered
If you need help, you might try dunck.us GreaseMonkeyUserScriptRequest
I have added all of the above to
Please feel free to contribute to that wiki page.
Also see panlibus: Greasemonkey - one brand of Library 2.0 glue.
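For readers who haven't written one, a user script is just a bit of JavaScript run against the pages you visit. Here is a minimal sketch of the kind of thing discussed above: rewriting ISBNs found in page text into links pointing at your own catalogue. The catalogue URL pattern is a placeholder I made up; substitute whatever your OPAC actually accepts.

```javascript
const CATALOGUE = "http://library.example.org/isbn/"; // hypothetical URL

// Pure helper: wrap bare 10-character ISBNs (nine digits plus a final
// digit or X) in a link to the catalogue.
function linkifyIsbns(html) {
  return html.replace(/\b(\d{9}[\dX])\b/g,
    '<a href="' + CATALOGUE + '$1">$1</a>');
}

// In a real Greasemonkey script this would run against the page, e.g.:
//   document.body.innerHTML = linkifyIsbns(document.body.innerHTML);
// (Walking text nodes is safer than rewriting innerHTML wholesale, and
// a production script would also handle hyphenated and 13-digit ISBNs.)
```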
I find it interesting that a lot of blogs report every month that a new D-Lib is out.
I like D-Lib a lot, but are these announcements due to the value of its content, or out of old habit?
Um, there are feeds? There doesn't seem to be one available directly from Ariadne, but anyway.
This is a nice clear description of how IT departments may behave, and how librarians can relate to them:
In a very real way, libraries have sold their souls to the company store when it comes to their integrated library systems. I've been very vocal about demanding change from vendors, and I think they're listening—not just to me, but to a growing din of dissatisfaction.
So, if you can't rely on your vendor, do you place your trust in your IT department? The answer is: it depends. Many libraries simply do not have IT departments. Many rely on one person to manage everything. Those libraries face a formidable challenge.
When it comes to larger IT departments, however, I'm going to make an oversimplified generalization. I've noticed that IT tends to exist somewhere between two poles. On one end, there's the department that simply establishes and maintains an IT infrastructure; the department's staffers do not get involved at all with policy or strategic planning. If you need a new server, they install it; if you need more RAM, they bring it. These departments tend to be run as small businesses, operating within the parameters of their budgets, making purely technical recommendations. They manage the help-desk and rarely deviate from their SOP.
On the other end are IT departments filled with creative, artistic people with a genuine interest in improving their institutions. This type of department tends to play a larger role in an organization. Staff members in this type of IT department make recommendations that have ramifications extending beyond a technical scope. They do research, and they typically have a programmer on staff who "makes it happen." They are usually very opinionated and have strong feelings about how things should be. They are much like librarians in that way.
If you are interested in starting a dialogue with your IT department, it would be a good idea to identify which one of these poles your organization's department leans toward, because that'll determine how to approach IT staff.
If your IT department is more like the first, then you ought to go to staff members with very clear ideas of what you want. It might not be a bad idea to budget some money for an outside consultant to help you put together a proposal or plan using the right language. You'll have to think of the staff members in your IT department as the fabricators of your design. This can be a challenge, because, as a non-techie, you're really unaware of what's feasible and what's not.
If the department resembles the second type I described, then you stand a better chance of success by bringing an outline of sketched ideas to the table—a rigid plan may cause friction.
from ALA TechSource - On the L2 Train
Of course as indicated above, it can also be much more complicated.
Maybe part of your IT department just does whatever they are told to do, and part is more pro-active.
Or maybe your IT department used to just follow the directions they were given, but is trying to transform to be more engaged.
Comes up with a map with all my local libraries marked (I guess it's using IP recognition to guess my location? Or I had a cookie set with my location at some point.) Clicking on a library brings up a bubble with a link to the library website. Simple, yet incredibly useful. Supports both Google and Yahoo Maps.
I learned some new HTML today.
This is kind of cool because it's possible it's the first HTML I have learned since like, 1994.
<acronym title="Service-Oriented Architecture">SOA</acronym>
In both Firefox and IE the title text is displayed when you hover over the acronym; in IE, however, there is no visual cue that the tooltip is there. In Firefox the acronym gets a dotted black underline.
The main goal is roughly "if only humans would mark up everything they write with helpful descriptive information, it would make it a lot easier for machines to understand".
The problem is, people don't even use the existing HTML markup tags to add meaning to their web pages.
You can already use <abbr>, <acronym>, <address>, <cite>, <code>, <dl> <dt> <dd>, <kbd>, <q> <blockquote>, <samp> ... but no one uses these to mark up their pages. I predict that people will continue not to do this.
I have a suggestion: make the machines smarter.
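By "smarter" I partly mean doing the markup for people. A sketch of the idea in JavaScript: wrap known acronyms in the `<acronym>` tags that authors won't add by hand. The glossary here is an assumed, hand-made mapping; a genuinely smarter machine would infer expansions from context.

```javascript
// Hand-made glossary (assumed data); keys are acronyms, values are titles.
const GLOSSARY = {
  SOA: "Service-Oriented Architecture",
  OPAC: "Online Public Access Catalogue"
};

// Wrap known all-caps acronyms in <acronym> tags with a title attribute,
// leaving unknown words untouched.
function markupAcronyms(text) {
  return text.replace(/\b([A-Z]{2,})\b/g, (match, word) =>
    GLOSSARY[word]
      ? '<acronym title="' + GLOSSARY[word] + '">' + word + "</acronym>"
      : match);
}
```

Run over a page (as a user script, say), this would give every known acronym the hover-for-expansion behaviour described above without the author lifting a finger.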
Ottawa, Canada, 23 December 2005 — Canada’s International Development Research Centre (IDRC) is pleased to announce plans to create an Open Archive, the first among Canadian research funding organizations. The Open Archive will provide full access over the Internet to IDRC’s rich research archive. In addition to making information more freely available, this initiative will provide IDRC-funded researchers with a much-needed outlet to publish and showcase their work.

The world of scholarly communications is rapidly changing. The emerging culture of protecting intellectual property, soaring costs of accessing research literature, and difficulties in having research published in traditional journals are restricting the development of research capacity in the South. The Open Archive will help Southern researchers to engage in the international dialogue on important development issues and increase the impact of their research.
IDRC Press Release - IDRC Champions Intellectual Platform for Developing Countries
Since I'm not a librarian, I may not always do the best job of communicating to the library profession. "Libraries need networked services because... it's obvious," would unfortunately tend to be my explanation. Fortunately, Eric Lease Morgan comes to my rescue with a clear and thoughtful posting on Library services and in-house software development.
This essay outlines two opportunities for the library profession: 1) the increased need for library services and not necessarily library collections, and 2) the ability to be leaders in the implementation of services through in-house software development.
I have said "networked services" specifically to distinguish technology-based services from traditional "services to clients". One methodology for identifying and creating such services is Service-Oriented Architecture (SOA); in my organization this leads us to talk about "SOA services", in order to make things clear. I will caution that I believe it is quite challenging to use SOA without a pre-existing Enterprise Architecture (EA) context, or at least without creating one alongside it.
One of the big challenges and questions in the networked services area, I think, is to what extent you can deliver sustainable services using very lightweight point-solution scripting approaches, and to what extent you need a larger superstructure of service architecture and design.
UPDATE: Also, I think the nature of the services you need to provide is very much linked to the nature of your library. I can imagine a public library providing services like Book Reservation, Book Loan Status, or even e.g. providing free blog hosting for local book clubs, with built-in services to link to the library's holdings and to add reviews and tags to the library catalogue. For an academic/research library, the services should be very much researcher-centric. In fact, I can imagine the library providing much of the support infrastructure for the research workflow. While this may seem like an expansion of the library's role, it's quite logical: if I break research into Discovery (of background information), Reference (to the found knowledge), and Creation (of new knowledge) then the library has a role to play at each stage. Here are some quick thoughts:
As you can see, there is a big role in supporting knowledge discovery, as well as a role closely aligned with traditional archiving - build data, code and paper storage and management into the researcher's workspace, in order to avoid the current issues with attempting to get deposit into repositories after the fact.
My organization has identified networked library services as a key component of our future ("target state"), and we're interested in working with / partnering with other organizations to help further explore, define and implement such services.
while open source software — due to its strong impact on business and on bridging the digital divide — has drawn much attention, it may provide false hopes for the sustainability of openness in other areas of content that need careful examination. The conference FM10 Openness: Code, science and content — Making collaborative creativity sustainable provides a platform for such analysis and discussion, resulting in concrete proposals for sustainable models for open collaboration in creative domains.
The conference will draw on the experience of First Monday as the foremost online, peer–reviewed academic journal covering these issues since May 1996. Not only has First Monday published numerous papers by leading scholars on the topics of open collaboration, open access, and open content in its various forms, it is itself an example of open collaboration in practice: for nearly a decade, the journal has been published on a purely voluntary basis, with no subscription fees, advertising, sponsorship or other revenues. The success of First Monday is demonstrated by thousands of readers around the world, downloading hundreds of thousands of papers each month. This conference celebrates First Monday’s tenth anniversary.
We invite papers for the conference and for a very special issue of First Monday. These papers will be reviewed by a special conference editorial committee. Authors of selected papers will be invited to the conference, scheduled to take place at the University of Illinois at Chicago, 15–17 May 2006. The conference is supported in part by the John D. and Catherine T. MacArthur Foundation (http://www.macfound.org/) and the Open Society Institute (http://www.soros.org/). Other selected papers will be published in a special issue of First Monday, to appear in June 2006.
Papers should address the issues involved in building sustainable models for openness in science, software and content. They can examine technical, sociological, economic/business and legal issues, and can be conceptual or practical in nature.
CFP deadline 6 February 2006
completed papers by 1 May 2006
Seed Magazine, a fairly new (to me, anyway) and good entry into the science magazine field, has entered the science blogging space with the easy-to-remember ScienceBlogs.com.
ScienceBlogs is the web's largest conversation about science. It features blogs from a wide array of scientific disciplines, with new voices coming on board regularly. It is a global, digital science salon.
ScienceBlogs is powered by Seed Media Group — publishers of Seed Magazine — and is part of the Seed Digital Network, which also includes Seedmagazine.com and Phylotaxis.com. Seed provides technology, hosting and other enabling support for Scienceblogs.com as part of its mission to increase public understanding of science.
via The Commentator
You can check my Weblogs category for more postings about science blogs.
I am far from being expert in these matters, but I know where I stand, which is: compensate creators reasonably, for a reasonable amount of time, allowing for all traditional fair use, first sale etc. rights.
This issue has some particular importance in Canada at the moment as we are having a Federal Election.
The Corporation tries to pull everything into private space forever; this is a rational action by it to maximize profits.
I will let more expert voices speak, but I do have one note: the "UN" WIPO treaty was just the US advocates of DMCA legislation, thwarted in the US, making international law behind closed doors and then coming back to the US and saying "oh dear, it's international law, I guess we must have a DMCA now". I don't have a good ref on this; feel free to provide one.
I find it interesting that the passion of the Canadian "rational technology policy" groups seems to be around copyright (which is certainly an important issue) with, as far as I can tell, no energy devoted yet to fighting electronic voting in Canada.
Anyway, on to some resources and thoughts on copyright and DRM in Canada.
Cory Doctorow has a major article out in Sunday's Toronto Star: Trademark political shenanigans.
Worrying about the Americanization of Canada is practically a national sport, and nowhere is it more prevalent than in politics. Recently, two-term Liberal MP Sam Bulte (Parkdale-High Park) has followed her American counterparts down the path of rewarding her entertainment-industry campaign backers by throwing her support behind U.S.-style copyright laws for Canada.
On Thursday night, just four days before the election, she will host a $250-a-plate fundraising dinner sponsored by the heads of the industry associations for the film, music, publishing and videogame sectors.
Take 1998's Digital Millennium Copyright Act (DMCA), a law that was enacted to implement the UN's WIPO Copyright Treaty. This is the same treaty that Bulte led the charge on trying to implement in the last Parliament with Bill C-60, Canada's own controversial copyright reform bill. (The bill died when the election was called but could be resurrected if the Liberals win the election; Bulte is thought to be a shoo-in for the job of minister of Canadian Heritage, which would make her responsible for copyright laws.)
The DMCA, passed in Congress with no debate, represents a kind of farcical wish list for rights holders who don't want to have to reinvent their business to accommodate the digital age. It lets anyone who claims to be an infringed-upon copyright holder demand the censorship of Internet materials without showing any evidence that any infringement has taken place.
Another form of "self-help" is so-called Digital Rights Management (DRM) technology. These are the technological restrictions put on the media that you buy, such as games, CDs and DVDs, that seek to control how you use works after you buy them.
These DRMs indiscriminately restrict the enjoyment of your lawful property, allowing rights holders to control your private use of media in ways not considered under copyright law. For example, Adobe's eBook technology blocks your ability to copy and paste a quotation, even where copyright law would allow it, e.g. in the course of criticism or in academic research.
DRM technology on DVDs prevents you from watching discs bought overseas in a Canadian DVD player, despite the fact that copyright doesn't give creators the right to control where their creations are viewed after they've been sold. That's why you don't need to leave your Canadian editions of your favourite books at home when you go on holidays in foreign countries.
Bill C-60 contained its own analogue to the Broadcast Flag: a mandate that required libraries to use DRM technologies when loaning out digital documents to patrons. Under C-60's proposal, Canada's tax-funded libraries would have to buy and use onerous restriction software to control how patrons used their digital documents — and their patrons would also have to pay a premium for DRM technologies to read and use these documents.
The outcome of the American approach that Bulte advocates is nothing less than a war between the entertainment industry and the public. In the U.S., thousands of lawsuits have been filed against music fans, including grandmothers and little girls in subsidized housing.
If you want a UK angle, I liked the posting by Kevin Marks: Response to the UK Parliament's DRM Inquiry
Each computer's immanent ability to become any kind of machine, and the copying of data that happens as part of this, leads the DRM advocates naturally to the point where they want to outlaw computers, or to take them over by stealth, using virus-like techniques.
The reductio ad absurdum of this is to privilege DRM implementers in law above the owners of the computers on which their software runs, without their effective consent. Sadly, this is exactly what is being demanded by the publishers' lobby.
Slashdot points out yet another issue with DRM: it is in a way an attack on your own computer, potentially compromising your security. The Choice Between DRM and Security, quoting a Groklaw article
We are entering the era of ubiquitous and safety critical computing, but the developers of DRM technologies seem to believe that computers are nothing more than personal entertainment systems for consumers. This belief is convenient, because creating DRM mechanisms that respect security, safety, and reliability concerns is going to be an expensive and complex engineering task.
I'd like to mention a side pet peeve of mine, which is that copyright holders are demanding so much money for even brief bits of their content that it is slowing, preventing, or altering DVD releases of TV shows. Or even important documentaries, the pre-eminent example being Eyes on the Prize. (Eventually rescued by donations of over US$850,000.)
There are also ongoing battles around the ability to make copies of digital content. This includes forcing HDCP DRM on digital content, the endless HD DVD format and DRM wars, and digital radio. (As a side note, corporate paranoia leading to increased DRM associated with enhanced digital quality has basically killed at least three technologies so far: DAT, SACD and DVD-Audio.) It gets even worse, as the latest idea is a "broadcaster's right" and "webcaster's right". That doesn't mean "right to webcast"; it means "right to prevent anyone else from ever showing anything you transmit, even if you don't own the content yourself".
As luck would have it, there are a wealth of Canadian resources on these topics:
The Globe had a major Centrepiece in the Saturday Focus section about the Gulf Stream and potential impacts on Europe if it weakens or fails. One might have an interesting discussion around it, but the article does not exist online at all (as far as I can find). If it's not online, then it can't easily live beyond the minds of those who read it when published - no ongoing discussion, no role to play in an ongoing debate.
In southern England, on the latitude of Labrador, the grass is green. The landlord's cats are padding along a flagstone border by the garden, and morning sun is striking sparks from the spikes of a little palm.
That this scene is possible in January at a point on the globe more than 800 kilometres further north than Toronto is because of a trillion kilowatts of heat released into the air by the Gulf Stream flowing northward west of Ireland. This great ocean current is now in danger of collapse. If the Gulf Stream goes, so does the heat. Things very quickly will get harder for the palms, the landlord's cats, and, I guess, the landlord too.
The ever-popular topic of technology and community. Have we lost the public space? Personally I think TV killed the public space long ago. Just because we could all talk about last night's episode of Friends doesn't mean we still had a useful body politic. And practically as soon as Internet filtering concepts came out, people started worrying about people creating their own personal "Daily Fringe Opinions Reinforced Times" news portals. Anyway, the Globe takes a stab at the issue in Wired Up, Plugged In, Zoned Out
all the creations, news and thoughts of humanity at your fingertips -- making this generation the most plugged-in and informed in history.
But many observers are concerned this tech-savvy revolution could be leading just the opposite way. The filters and search devices used to make all this information manageable, they say, are isolating people into niches fashioned to their particular tastes and beliefs. Instead of going to common sources, whether newspapers or broadcast TV, to get the daily news, users are getting only the "daily me."
It's a phenomenon U.S. historian Christine Rosen has described as "egocasting."