Topic: My Perfect Computer
Ian Price
Phenomenal Film Handler
Posts: 1714
From: Denver, CO
Registered: Jun 99
posted 10-18-2000 12:47 AM
This topic is all Ken Fong's fault. My perfect computer:

- The operating system would be invisible.
- It would look like the Apple G4 Cube.
- It would use something like the Apple 22" Cinema Display, but at only $400.00 per monitor, and it would support up to 3 monitors.
- The connection would be digital, like the new Apple Cinema Display's, and it would support other screens of ever-increasing dimensions as long as they have the same connector.
- The connections would be something you could make at home, like RG6 or CAT5 wire.
- The sound output would be Dolby Digital and DTS.
- The "hard drive" would be non-volatile RAM, large enough for all of my music and 50 hours of stored television, like the ReplayTV box.
- The "operating system" would be on an EPROM, so boot-up is instant.
- The computer wouldn't care if it were left on 400 days a year and 8 days a week. In other words, the buffer and temp files wouldn't clog it up and need the computer to be turned off to clear them.
- The DVD drive would record.
- It would contain my satellite receiver.
- The keyboard would be wireless. Although it would have a remote mouse, it would also be voice activated.
- It would be able to have remote monitors and speakers, much like a Bose sound system.
- It would be my phone system, and voice calls would be free.
- Broadband would be able to transmit proper video signals.
- My digital camera, cell phone and personal sound system would be integrated into my glasses, wirelessly. And they wouldn't weigh much. And of course, I wouldn't need my glasses to correct my vision.

What would yours be like?
Evans A Criswell
Phenomenal Film Handler
Posts: 1579
From: Huntsville, AL, USA
Registered: Mar 2000
posted 10-18-2000 02:32 PM
Y'all are right about computers becoming obsolete quickly. It's funny how the equipment keeps improving speed-wise, but the theory of computing hasn't changed all that much lately. We can do the same old things faster. Look at the clock speeds found in most PCs (near top of the line) in common use each year:

1981: 4.77 MHz
1984: 6 MHz
1986: 8 MHz
1988: 12 MHz
1989: 33 MHz
1991: 50 MHz
1992: 66 MHz
1994: 100 MHz
1996: 200 MHz
1997: 266 MHz
1998: 400 MHz
1999: 600 MHz
2000: 1000 MHz

But, on the other hand, take the topic of formal languages and automata theory. Look at the 1969 and 1979 editions of a standard textbook, such as the ones by Hopcroft and Ullman, and a lot of the material is the same. I took a class in 1995 and we used the 1979 edition. It just hasn't changed much. Look at NP-completeness: we used a 1979 book for a class on this topic in 1999! It seems the theory concerning algorithms in general, and what we can and cannot do efficiently (in polynomial time), or at all, has changed very little in the past 20 or so years. In fact, some of that theory was in place in the 1930s and 1940s, before modern computers were even being built.

I have a coworker who has many ideas similar to the ones in the list that started this thread (the operating system being invisible, getting rid of filesystems, and hiding implementation details from the user). I just don't think a lot of those things will ever happen. I am currently working on a Ph.D. in computer science (I have an M.S. in computer science and a B.S. in mathematics). For a lot of them to happen, I think there need to be some advances in the underlying theory of computing.

Speed is nice, but I have seen many cases where improving the algorithm (if possible) can make a program run at acceptable speed even on a slow computer (see the sketch below for a classic example). Faster computers have caused software developers to get lazy, and the code they write has become larger, more "bloated", more memory hungry, and more disk-space hungry than ever before. Until the focus is back on efficiency and better algorithms, the trend will continue.

Operating systems have changed quite a bit during the past 15 years. I've used everything from old second-generation operating systems like MVS for IBM mainframes, to VMS (VAX systems), to UNIX (many platforms), to DOS, OS/2, and Windows for PCs. I tended to like UNIX the best, although you have to run Windows to be able to run the most popular software. I have a 933 MHz Pentium III here with Windows 2000, and it does most things I want to do really well, but this machine cannot do multiple things at once very well without killing the user interface response time. Silicon Graphics workstations running UNIX were much better at that (running background jobs without killing performance for foreground jobs). Although Windows 2000 is great, there will always be device driver pains and functionality that you have to add by installing 3rd-party software. I just wish the shell in Windows 2000 were a full-featured UNIX shell. I really wish Windows 2000 had UNIX underneath supporting it. It would be a much better system if it did.

Well, I guess I've yakked enough for now.

Evans
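To make the algorithm point above concrete, here is a rough C sketch (an invented textbook example, not from any real program): checking a list for duplicates by comparing every pair is O(n^2), while sorting a copy first and scanning neighbors is O(n log n). On a big list, that difference swamps any clock-speed gain.

#include <stdio.h>
#include <stdlib.h>

/* Naive duplicate check: compare every pair -- O(n^2). */
static int has_dup_naive(const int *a, size_t n)
{
    for (size_t i = 0; i < n; i++)
        for (size_t j = i + 1; j < n; j++)
            if (a[i] == a[j])
                return 1;
    return 0;
}

/* qsort comparator for ints */
static int cmp_int(const void *p, const void *q)
{
    int x = *(const int *)p, y = *(const int *)q;
    return (x > y) - (x < y);
}

/* Better: sort a copy (O(n log n)), then scan adjacent elements. */
static int has_dup_sorted(const int *a, size_t n)
{
    int *copy = malloc(n * sizeof *copy);
    if (copy == NULL)
        return -1;                      /* allocation failure */
    for (size_t i = 0; i < n; i++)
        copy[i] = a[i];
    qsort(copy, n, sizeof *copy, cmp_int);
    int dup = 0;
    for (size_t i = 1; i < n; i++)
        if (copy[i] == copy[i - 1]) { dup = 1; break; }
    free(copy);
    return dup;
}

int main(void)
{
    int data[] = { 7, 3, 9, 3, 5 };
    size_t n = sizeof data / sizeof data[0];
    printf("naive: %d, sorted: %d\n",
           has_dup_naive(data, n), has_dup_sorted(data, n));
    return 0;
}

Same answer from both, but only one of them is still usable when n is a few million.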
Scott Norwood
Film God
Posts: 8146
From: Boston, MA. USA (1774.21 miles northeast of Dallas)
Registered: Jun 99
posted 10-18-2000 02:47 PM
There actually are some clones of Unix shells (bash and csh, at least) that run on Win32 and give the user access to most of the standard GNU tools (gcc, gdb, etc.) in a Unix-like environment. It's a nice concept in theory, but all this stuff seemed pretty half-assed to me: it does nothing to improve the overall stability of the underlying OS, and a lot of the command-line tools that would be useful for scripting don't exist in NT (e.g. you can't write a script that adds new users by appending lines to /etc/passwd, and so forth; see the sketch below). As far as I'm concerned, every OS ought to have a full command-line interface to every tool, regardless of whatever GUI might be slapped on top of it. There are just too many situations where one needs to access a machine remotely or automate a process using scripts for the command-line interface to be ignored. GUIs are nice, but they suck over low-bandwidth connections and for repetitive tasks.
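For the curious, here is what that /etc/passwd trick amounts to, spelled out as a rough C sketch rather than the usual shell one-liner. The account fields are made up, and on a real system you would use vipw or useradd, which lock the file properly:

#include <stdio.h>

/* Sketch: add a user the old-school Unix way, by appending a
 * passwd(5) line.  Fields: name:password:uid:gid:gecos:home:shell.
 * Illustrative only -- no locking, needs root. */
int main(void)
{
    FILE *pw = fopen("/etc/passwd", "a");
    if (pw == NULL) {
        perror("fopen /etc/passwd");
        return 1;
    }
    /* hypothetical account details */
    fprintf(pw, "%s:%s:%d:%d:%s:%s:%s\n",
            "newuser", "x", 1001, 1001,
            "New User", "/home/newuser", "/bin/sh");
    fclose(pw);
    return 0;
}

The point isn't that you'd do it this way; it's that on Unix the whole account database is a plain text file any tool can drive, and on NT it isn't.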
As for multitasking, I've been pretty impressed by how much performance is gained by adding a second processor. This SS20 that I'm typing on now felt pretty slow with a single 50 MHz processor; when I added a second 50 MHz processor, it became amazingly zippy. It can easily deal with me compiling something, listening to MP3 files, formatting something with LaTeX, and looking at web pages all at once. I don't know if Win2k handles multiple processors as well as Solaris (I tend to doubt it), but it would be interesting to see how much the performance improves.
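If you want to watch the second CPU earn its keep, a minimal fork(2) sketch like this one (a toy prime-counting job; the numbers are arbitrary) splits a CPU-bound task across two processes. Run it under time(1): on one processor the wall-clock time is roughly the sum of the halves, on two it's roughly the slower half.

#include <stdio.h>
#include <stdlib.h>
#include <sys/wait.h>
#include <unistd.h>

/* Toy CPU-bound job: count primes in [lo, hi) by trial division. */
static long count_primes(long lo, long hi)
{
    long count = 0;
    for (long n = lo; n < hi; n++) {
        if (n < 2)
            continue;
        int prime = 1;
        for (long d = 2; d * d <= n; d++)
            if (n % d == 0) { prime = 0; break; }
        count += prime;
    }
    return count;
}

int main(void)
{
    const long limit = 2000000;

    /* Split the range across two processes; on a dual-CPU box the
     * kernel can schedule one on each processor. */
    pid_t pid = fork();
    if (pid < 0) { perror("fork"); return 1; }

    if (pid == 0) {                       /* child: upper half */
        long c = count_primes(limit / 2, limit);
        printf("child:  %ld primes in upper half\n", c);
        exit(0);
    }
    long c = count_primes(0, limit / 2);  /* parent: lower half */
    printf("parent: %ld primes in lower half\n", c);
    wait(NULL);                           /* reap the child */
    return 0;
}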
Evans A Criswell
Phenomenal Film Handler
Posts: 1579
From: Huntsville, AL, USA
Registered: Mar 2000
posted 10-19-2000 01:41 PM
Scott, you're right about the second processor helping a lot with multitasking. In our research center, we have a couple of machines that have two 400 MHz Pentium III equivalents running NT. The machine in my office is a 933 MHz Pentium III. When I run a job in the background, like untarring or gunzipping a 75 megabyte SSM/I data file, my 933 MHz machine becomes practically unusable for anything else, but the dual-processor 400 MHz machine seems unaffected by it and remains usable and responsive.

We have, for years, been using multiprocessor SGI servers, one of which has 8 CPUs; it is heavily used but remains very responsive and never seems to get bogged down. (This particular machine runs IRIX, which is SGI's UNIX.) I've always found it interesting that IRIX machines seem to remain responsive with a background task running, even if the machine has only one CPU. Single-processor PCs tend not to handle background processing well at all without seriously impacting the response of the GUI.

Evans
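Unix schedulers generally age down CPU hogs in favor of interactive processes, and a background job can also be polite explicitly. A minimal sketch (the heavy work itself is elided) that drops its own priority before grinding through a big file; from the shell, the equivalent is just "nice -n 10 gzip -d file.gz":

#include <stdio.h>
#include <unistd.h>
#include <errno.h>

/* Sketch: lower this process's scheduling priority before doing
 * heavy background work, so the machine stays responsive.
 * nice() adds to the nice value; higher nice = lower priority. */
int main(void)
{
    errno = 0;                       /* nice() can legally return -1 */
    int newnice = nice(10);          /* be nicer to interactive jobs */
    if (newnice == -1 && errno != 0) {
        perror("nice");
        return 1;
    }
    printf("running at nice %d\n", newnice);

    /* ... heavy work here, e.g. decompressing a big data file ... */
    return 0;
}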
Randy Stankey
Film God
Posts: 6539
From: Erie, Pennsylvania
Registered: Jun 99
posted 10-20-2000 07:35 PM
One question... What does MMX actually do? It took me months to find out. The only answer I could get was, "It speeds up graphics..."

And the answer is: MMX is a set of packed-integer (SIMD) instructions that reuse the floating-point registers. The problem is that you can't do floating-point math while the MMX state is active; you have to clear it first (the EMMS instruction). You can only do integer math, but it's integer math on several values at once, which is exactly what pixel-pushing is, hence "it speeds up graphics."
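You can see both halves of that, the packed integer math and the EMMS requirement, from C with the MMX intrinsics. A toy sketch with arbitrary values (on gcc, compile with -mmmx):

#include <stdio.h>
#include <mmintrin.h>   /* MMX intrinsics */

int main(void)
{
    /* Pack four 16-bit integers per register and add all four
     * pairs in a single instruction -- typical pixel-style math. */
    __m64 a = _mm_set_pi16(40, 30, 20, 10);   /* elements 3..0 */
    __m64 b = _mm_set_pi16(4, 3, 2, 1);
    union { __m64 v; short s[4]; } out;
    out.v = _mm_add_pi16(a, b);               /* {11, 22, 33, 44} */

    /* MMX state aliases the x87 registers: clear it with EMMS
     * before doing any floating-point math again. */
    _mm_empty();

    printf("%d %d %d %d\n", out.s[0], out.s[1], out.s[2], out.s[3]);
    printf("floating point works again: %f\n", 1.0 / 3.0);
    return 0;
}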