Thursday, September 30, 2010

Week 4 - Muddiest Point

Concerning vector images/graphics: are these images still being displayed using pixels, or is the image itself created by some other means? If it is created from pixels, how can the image be displayed and magnified at arbitrarily high quality without pixelation or scaling issues?
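To make the question concrete for myself, here is a toy sketch (my own, not from the readings) of the distinction as I understand it: a raster image is a fixed grid of pixels, while a vector image is a set of drawing instructions that can be re-rasterized at any output size. The circle and the numbers below are invented purely for illustration.

```python
# A toy sketch of the raster/vector distinction (illustrative, not from the
# readings). The "vector" description is just a circle's center and radius;
# rendering it means recomputing which pixels it covers at a given scale.

def rasterize_circle(cx, cy, r, scale=1):
    """Recompute the set of pixels covered by a circle at the requested scale."""
    cx, cy, r = cx * scale, cy * scale, r * scale
    return {
        (x, y)
        for x in range(int(cx - r), int(cx + r) + 1)
        for y in range(int(cy - r), int(cy + r) + 1)
        if (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2
    }

# Magnifying a raster image stretches existing pixels; magnifying a vector
# image just re-runs the instructions, so edges stay crisp at any size.
print(len(rasterize_circle(10, 10, 5, scale=1)))   # small rendering
print(len(rasterize_circle(10, 10, 5, scale=10)))  # large rendering, recomputed
```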

Sunday, September 26, 2010

Assignment #2

http://www.flickr.com/photos/15831551@N03/sets/72157624916193203/

I decided to digitize historic photographs of the town where I live (Buckhannon, WV) and the college where I work and where I went to school (West Virginia Wesleyan College).

Thursday, September 23, 2010

Week 4 - Comments

http://marclis2600.blogspot.com/2010/09/unit-4-readin.html?showComment=1285298789042#c6339777978878326185

http://emilydavislisblog.blogspot.com/2010/09/week-4-multimedia-representation-and.html?showComment=1285299650263#c7727104535591628798

Week 4 - Musings...

Data Compression - from Wikipedia

Reading this informative article brought to the forefront, for me, the issue of compatibility in computing. Compression is clearly one area that suffers from the potential for systems not to communicate properly when there is no standardized language to ensure compatibility. The concept of compression itself is appealing as a means of making material available without requiring huge disk capacities. At the same time, the various means of compression and the lack of definitive standards have led to a proliferation of compression schemes and file formats, each requiring a different piece of software to decode and decompress (e.g., MPEG-1, MPEG-2, and MPEG-4 compression, to say nothing of the various container formats: MP4, AVI, QuickTime, etc.).
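As a side note to myself, here is a small sketch of that decoder problem, using two compression modules from Python's standard library (zlib and lzma) purely as stand-ins for the video formats mentioned above. The point is only that each compressed stream needs its own matching decoder.

```python
# A minimal sketch of why compressed data needs the matching decoder.
# zlib and lzma stand in for any two incompatible compression formats.
import lzma
import zlib

text = b"the same content, two different compression formats " * 100

zlib_bytes = zlib.compress(text)
lzma_bytes = lzma.compress(text)
print(len(text), len(zlib_bytes), len(lzma_bytes))  # both far smaller than the original

# Decompressing with the right library round-trips the data...
assert zlib.decompress(zlib_bytes) == text
assert lzma.decompress(lzma_bytes) == text

# ...but handing one format's bytes to the other's decoder simply fails.
try:
    zlib.decompress(lzma_bytes)
except zlib.error as exc:
    print("zlib cannot read the LZMA stream:", exc)
```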

Data Compression Basics - from DVD-HQ

This is one of those articles that has 'basics' in the title, and I'm content to believe it until about the fifth page, when I start getting lost. The author, in fairness, tries to simplify things where possible and keeps an informal tone throughout. I made it through Part 1 with at least an understanding, by its conclusion, of where lossless compression is useful. I feel that section could have been expanded (with less time spent explaining the details of the compression algorithms), but overall the point was made. Lossy compression, on the other hand, was explained well early on, and my immediate association was with the questions of representation of content vs. representation of the object that crop up in many archival settings. In dealing with digital images, it is important to be aware of the ways in which various file formats and compression algorithms alter images, even if the results are nearly indistinguishable to the layperson viewing them.
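To keep the lossless/lossy distinction straight in my own head, here is a toy sketch using run-length encoding and simple value quantization. It only stands in for the real algorithms the article describes, and all the pixel values are made up.

```python
# A toy contrast of lossless vs. lossy compression: run-length encoding plus
# a crude "quantize the values" step in place of a real image codec.

def rle_encode(values):
    """Lossless: store each run of repeated values as [value, count]."""
    runs = []
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return runs

def rle_decode(runs):
    return [v for v, count in runs for _ in range(count)]

def quantize(values, step=32):
    """Lossy: snap each value to the nearest multiple of `step`.
    Similar values collapse together, so runs get longer and the data
    compresses better, but the exact originals cannot be recovered."""
    return [round(v / step) * step for v in values]

# One row of grayscale pixel values with long, nearly flat stretches.
row = [10] * 50 + [12] * 5 + [200] * 45

lossless = rle_encode(row)
assert rle_decode(lossless) == row     # perfect round trip: 3 runs

lossy = rle_encode(quantize(row))      # only 2 runs, but 10 and 12 both became 0
print(len(lossless), len(lossy))       # 3 2
```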

Imaging Pittsburgh... - Edward A. Galloway

Galloway's examination of the IMLS grant program to digitize images across multiple institutions touched on a number of critical themes for librarians and archivists. Primary among these was the critical role of standards and the need to ensure that standards are well communicated in cross-institutional projects such as this one. It seems these institutions didn't come to an agreement on metadata and subject-heading standards until later in the process, and questions about how such issues should be resolved persisted throughout the project. Another theme I noticed was the importance of communication and interaction, especially as it relates to core functions and understandings. The classic library/museum/archive boundaries seemed to pose problems for all of these well-established institutions, which presumably would have had some awareness of the issues going into a project like this. A concluding note: in my experience, the direction of higher education (and obviously of the IMLS) is to push for outcomes-based assessment. Libraries may do well to adopt this type of internal assessment, though I think much training work needs to be done before a paradigm shift of that sort can take hold.

YouTube and Libraries - Paula L. Webb

Not a particularly informative article, and it didn't introduce many new or difficult concepts. It seems like a good primer for a novice librarian with little exposure to or experience with YouTube's interface and with ambitions for greater access. For more advanced users, the article offers only the most basic conclusions, which are already well known or self-evident.

Tuesday, September 21, 2010

Week 3 - Muddiest Point

From our readings, I gathered that UNIX/Linux is not as user-friendly to work in. Nevertheless, if it has as many other advantages over Windows as were indicated in class (compatibility, security, price, etc.), why does Windows seem to hold such a large corner of the OS market?

Thursday, September 16, 2010

Week 3 - Comments

http://jsslis2600.blogspot.com/2010/09/week-3-reading-notes.html?showComment=1284700484644#c7799175330633986596


http://kel2600.blogspot.com/2010/09/reading-notes-september-13-2010.html?showComment=1284701314098#c6695603595434564287

Week 3 - Musings...

Bill Veghte - 'An Update on the Windows Roadmap'

Having resisted the recent onslaught of new Microsoft operating systems on my personal computer over the past few years and stayed with Windows XP, I was heartened to read this e-mail. Some of the information I knew (having worked with and confronted problems at work on machines loaded with both Vista and now 7), but I was less clear on the date when Microsoft would cease support for Windows XP. In many ways it makes sense, as the operating system will be nearly thirteen years old by the time support stops - a degree of longevity not frequently seen in operating systems today. Still, I was a bit disturbed by the fact that Microsoft seemed to rush the release of Vista despite its performance and compatibility issues. The old mantra of 'if it ain't broke, don't fix it' seems appropriate here, as Vista was widely panned for its shortcomings. When the previous operating system performs better than the 'latest and greatest,' there's a message there for Microsoft R&D.

Wikipedia - Mac OS X

My experience with Macs is admittedly limited - the last time I used one was around the initial release of OS X in 2001, and I don't remember much about it. Reading over this informative description brought some of it back, but it also confused me a bit because I lack a common frame of reference. Understanding the underlying system technology was at times difficult until I read the UNIX article. Even then, what a kernel is, as well as some of the Mac-specific technologies (Carbon, Cocoa, etc.), wasn't easy to understand without further research and supplemental reading.

Amit Singh - What is Mac OS X?

After reading the Wikipedia article on OS X and finding myself slightly confused, I was dumbfounded when I started looking over this article. The degree of technical knowledge needed to understand the majority of what is written here seemed overwhelming; I only started following the discussion around the point where Aqua was introduced. I'm struggling to come up with something intelligent to say about this article aside from the obvious fact that, despite Apple's ability to make a clean and simple interface, the back end of this particular OS appears as confusing as, if not more confusing than, Windows or any other operating system on the market today.

Machtelt Garrels - Introduction to Linux

An eye-opening introduction to an operating system about which I knew virtually nothing. I hadn't associated Linux with the open source movement and it quickly became apparent that the potential for technological growth and development here is immense and has already been well-exploited. I was less than heartened to read that it isn't the most user-friendly interface to work with, but it seems that the supporting community built up around it as well as its flexibility and compatibility more than make up for difficulties in working with it.

Week 2 - Muddiest Points

1) I'd like to know more about the difference between RAM and ROM - why and when is each used, and how is the 'speed' of each measured? It was mentioned that ROM is used for permanent data and instructions - can you give some more examples of how this works and what specific functions each type of memory carries out?

2) When we talk about binary code and bits of information, how does the computer know what values to assign to each 1 and 0? For instance, it was mentioned that when an image is digitized, the 1s and 0s allow the computer to assign values (different shades of black, white, color, etc.) to parts of the image. How is this accomplished? What component of the computer is responsible for assigning those values?
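Thinking about my own question a little further, my rough understanding (an assumption on my part, not something stated in the lecture) is that the bits carry no meaning by themselves; a program or file format supplies the convention for reading them, e.g. 'every 8 bits is one unsigned integer, and that integer is a grayscale intensity from 0 (black) to 255 (white).' A tiny sketch:

```python
# A small illustrative sketch of one such convention (my own example):
# group the bit stream into 8-bit chunks and read each chunk as an
# unsigned integer, then treat that integer as a grayscale intensity.

bits = "00000000" "10000000" "11111111"   # three 8-bit groups

pixels = [int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)]
print(pixels)   # [0, 128, 255] -> black, mid-gray, white under that convention

# A color convention works the same way: three 8-bit numbers per pixel,
# interpreted as red, green, and blue intensities.
```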

3) When we say that the read/write head electronically 'writes' to a hard drive platter, how is this accomplished? In other words, what process is used to write the information to the platter? Is it magnetic, a physical connection, etc.? I'm inclined to think it's not a physical connection, since there has to be a buffer, but I'm unclear on how the data is actually 'written' to the disk.

Thursday, September 9, 2010

Week 2 - Musings...

Personal Computer Hardware: Wikipedia

An interesting and insightful look at the inner workings of computers. I work every day with library patrons (college students and faculty) who have technology problems with their laptops, and I realized upon reading this how wantonly and arbitrarily they (OK, I'm guilty of this myself) tend to assign blame to certain computer parts when something goes wrong, with no real basis for it. I frequently hear, "Oh, I think the hard drive is going bad" as the default excuse for why any number of problems occur. After reading this article, I've realized not only that the hard drive is more than likely not the issue the majority of the time, but that the motherboard probably is (yet I've rarely heard someone blame their motherboard as the cause of their problems). Computers have become so entrenched in our lives and in society, yet reading this makes you realize how very little we tend to know about the true workings and make-up of these things that we use on a regular basis.

Moore's Law: Wikipedia

What jumped out most for me in this article is the correlation (albeit not substantiated by a traceable citation) of Moore's Law with obsolescence. In an age where the latest supposedly equals the greatest, and where there is a constant push to develop newer and faster computing resources, I anticipate the potential for our developed capacity to outpace our ability to utilize it. I see this as a problem more generally in information technology, where the exceptionally fast pace of development makes it very difficult for laypeople (let alone information professionals) to stay current and relevant. The potential exists, I think, for a new kind of digital divide to develop, in which obsolescence pushes out of the technology market people who, despite their best efforts, simply can't keep up with the ever-increasing pace of technological development because of financial or technical constraints.
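For my own sense of scale, here is a rough back-of-the-envelope sketch of the doubling the article describes. The 'every two years' figure is the commonly cited formulation, and the starting transistor count below is invented purely for illustration.

```python
# A back-of-the-envelope sketch of Moore's Law-style doubling every two years.
# The 1990 starting count is illustrative, not a real chip's specification.

start_year, start_transistors = 1990, 1_000_000

for year in range(1990, 2011, 5):
    doublings = (year - start_year) / 2
    print(year, int(start_transistors * 2 ** doublings))

# The count grows roughly a thousand-fold over twenty years, which is part
# of why hardware comes to feel obsolete so quickly.
```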

Computer Museum Website

Despite the fact that Moore's Law is said not to apply universally to every area of computing, a look through the Computer History Museum's website really gives one pause to take stock of how quickly and how far computing has come since its inception. From the crudest of possible beginnings, the history of computing seems to follow a course of exponential growth and development, and from the time personal computing really took off in the 1980s to the present, it's possible to see just how much development has taken place. The Computer History Museum's website helps put much of this into perspective while providing very helpful context for understanding the development of computing resources that we often take for granted today.

Thursday, September 2, 2010

Week 1 - Muddiest Points

If, as the OCLC report indicates, individuals are becoming more self-sufficient in their information seeking strategies while at the same time adopting methods of information retrieval that focus on reducing information into easily retrievable (if decontextualized) fragments (think smart phones), is it possible that we've invested too much power in the content providers to control access? Doesn't the potential exist for greater censorship of information according to the ends of the content providers?

Week 1 - Musings...

C. Lynch - "Information Literacy and Info. Technology Literacy..."

Lynch's writing stands out for me as a rushed, even incomplete attempt to formulate thoughts on an extremely broad topic in too small a space. Some of this could be attributed to the early date of the work and the still-emerging understanding of information technology literacy at the time. Lynch attempts to shift the paradigm of technology education away from an exclusively skills-based pedagogy toward a mix of broader theoretical context and technology skills. While his push for broader theoretical education about infrastructure and systems is well made, his emphasis on teaching new software tools, understanding how to use existing tools, and the ability to problem-solve with specific IT tools (software, hardware, etc.) seems to hold the same potential to date and become obsolete as his own example of 'touch-typing.' With the increasing speed of technology development, the ability to teach the 'learning of new tools' has the potential to date even more quickly than the skills Lynch seems to discredit. Lynch manages to encapsulate the importance of interconnected systems and the need for pedagogy around those areas, but in general he fails to take fully into account the nature of obsolescence as it relates to the teaching of new technology.

Vaughan, Jason - "Lied Library @ four years..."

Vaughan's case study perhaps reveals the 'right' way for libraries to handle a technology refresh, provided you work in a library that appears to have unlimited institutional support and financial backing. Certainly the case study makes the important point that planning is critical to the implementation of new technologies, and it helps illustrate that planning for technology must be a collaborative process. If unilateral action is taken, the potential always exists for someone's needs to go unmet; it seems UNLV was largely able to avoid this pitfall. At the same time, I was struck by the mentality I picked up on throughout the piece that the "latest" technology automatically equals the "greatest" technology. While UNLV lucked out in getting brand-new computers with the Windows XP operating system, imagine how differently this article might have read if the technology refresh had taken place just a few years later, with the release of Windows Vista. Considering the widespread disappointment around the computing world with the reliability of that operating system, the assumption that new = best would be seriously called into question. For other libraries considering technology upgrades, the article (if nothing else) reinforces the importance of patience and careful testing to ensure that new products meet the reliability and functional needs of the institutions they will serve.


OCLC - "2004 Information Format Trends: Content, Not Containers"

OCLC's report succeeds in identifying trends in technological development (as it relates to cultural integration and information distribution) that have largely held true and have essentially been magnified since the report was released. Especially relevant is its observation of the important impact of mobile electronic devices and their destabilizing influence on the hegemony of the computer in the delivery of information content. What I would have liked to see the article address is the potential impact on information quality as the means of delivery become smaller and the expectations for information become more fragmented and less contextualized. Information sought on a cell phone or other mobile device is usually meant to be provided quickly and easily, with a minimum of unnecessary material. With this paring down of information to the 'lowest common denominator,' it will be interesting to see whether libraries (which are working to tailor their information retrieval methods to 21st-century information-seeking behavior) are able to maintain the quality and context of the information sought while making their search interfaces relevant to a highly mobile and generally impatient generation of information seekers.