
Monday, May 21, 2012

Temperamental Television

Until recently I didn't realise just how sensitive DTV (Digital Television) actually is. My experience tells me that an amplifier is highly recommended (unless your dongle/device/set top box is exceedingly good). What has made me particularly curious, though, is the impact of local electrical appliances (such as power boards, heaters, and even networking devices; removing or swapping particular devices can have an immediate impact on signal strength/integrity) and of the connections between the tuner, antenna, amplifier, etc... While I admit that some of the equipment in my setup could be of a higher standard, the tolerances that we're talking about are borderline absurd. Changing the angle of connections is enough to change the signal strength/integrity substantially. Moreover, signal integrity issues at certain points are extremely difficult to debug without specialised equipment (I've seen some digital TV signal strength meters being sold but I haven't tried one as yet and the cost outweighs the possible gains. Admittedly, there are signal strength readings in the tuner software itself but this is not ideal as they can't be taken at an arbitrary position in the pipeline.) and the only real feedback that you get is how 'choppy' the sound/picture is.

I've found that moving the amplifier's power point onto a less noisy circuit can be extremely helpful, as can changing the angle of the cable being used to carry the signal into the various devices (the amplifier is particularly susceptible to this problem and I've found it best to lay it on a flat surface). I may experiment with microwave based transmitters (2.4/5GHz range) later on but early indications/reviews say that cable may actually be better. I've also been thinking about using an adapter to carry the signal over a non-coax cable (or a medium that is less susceptible to issues related to cable bending) to hopefully reduce the impact of these problems, but once again it's a cost/benefit issue and I seem to have found a viable solution as is.

I've figured out that some of my recent problems with LibreOffice may also be due to issues with externally inserted objects (such as images). I've been working on a new document (on 'Internet and Computer Security') and it's 225+ pages/67K+ words and there have been none of the random crashes that I've previously experienced when working on larger documents. I guess I'll have to write the text beforehand and add other objects in afterwards from now on (at least until they fix the problem). This should also get me around another problem that I've found when editing text while there are images in the document (they don't move correctly with the text when cut/pasted). The other alternative is to switch to another system of document management...

I have a theory about some connectivity issues I've been having lately. While playing around with MTU values has resulted in success, I've also noticed something else. Sometimes there seems to be a noticeable delay at certain critical points, almost as though the traffic is being buffered. Also, if there is other web based activity occurring simultaneously this helps to get around the problem of stalls/stops at these particular points. I suspect there may be a timeout value that I may be able to tweak in my browser to help smooth out this intermittent problem (I've been dealing with it by dynamically shifting MTU values (depending on the circumstance) along with using HTTP 206 partial download capabilities, but am thinking of building something more robust/automated or finding a better configuration for a more elegant solution.).

about:config
opera:config
http://100pulse.com/http-statuscode/206.jsp
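
For what it's worth, the workaround described above (dynamically shifting MTU values plus ranged/partial downloads) boils down to something like the following; the interface name and URL are placeholders and the MTU value is simply one that has worked for me:

ip link set dev eth0 mtu 1392                                   # drop the MTU on the relevant interface (eth0 is a placeholder)
curl -r 0-1048575 -o part0.bin "http://example.com/file.bin"    # request only the first 1MB; an HTTP 206 reply means ranges are honoured
curl -C - -o file.bin "http://example.com/file.bin"             # or resume a stalled download from wherever it stopped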

- as usual thanks to all of the individuals and groups who purchase and use my goods and services
http://sites.google.com/site/dtbnguyen/
http://dtbnguyen.blogspot.com.au/

Friday, May 18, 2012

Fujitsu Stylistic ST4120 Recovery without Recovery Disks

Someone recently brought me a laptop which was suffering from 'performance anxiety'. There were some obvious issues. File system fragmentation, low spec (P3 933MHz/768MB RAM/30GB HDD), power management issues (it was underclocking itself to a third of its maximum clock speed even when it was plugged into the local AC point), and general sub-standard upkeep (a lot of unrequired crud was installed on the system). I decided to re-install the system using the existing recovery partition by running D:\RECOVERY.EXE

Somehow, the original owner had managed to delete a critical file though (D:\INSTALL\WATERMRK.JPG) which was required for the program to run. After creating this file the program started but refused to run due to the recovery image being older than the currently installed operating system.

I initially tried to get around this problem by uninstalling SP3 (the process is similar to removing a Hotfix but more prolonged for obvious reasons). I discovered early in the piece that physical recovery disks were unavailable and the documentation said that you required a special boot disk to access the necessary files on the recovery partition. I didn't want to go with a clean XP Professional install because it would be cleaner and more elegant if I could just get the recovery system to run.


However, I discovered that this was not enough (I found out that the recovery image was based on Windows XP Tablet PC Edition SP1) and that I needed to go further. Once SP3 had finished removing itself I discovered that it had also removed a critical uninstall file. I tried to get around this through registry modifications of keys under the following two locations (I could only guess how the program was getting its version information because I couldn't load tools onto the system due to it being so slow and unstable after removing SP3).

My Computer\HKLM\SOFTWARE\Microsoft\Updates\Windows XP\
My Computer\HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\

Obviously, this was unsuccessful so I tried using a standard XP boot disk to boot into 'Recovery Mode' (it's three taps using the digitiser, or the F12 button if you have a keyboard, to activate the boot menu option). I noticed that changing into the various directories ("Access is denied.") and running D:\RECOVERY.EXE ("Command is not recognized.") was impossible though, even though it was a FAT32 based filesystem partition and seemed to use standard PE32 executables on initial view (I expected to at least see the standard "This program cannot be run in DOS mode." string and had even seen this string in the initial header of the executable when viewing it via a text editor.).

user@system:/media/sdd1$ file RECOVERY.EXE
RECOVERY.EXE: PE32 executable for MS Windows (GUI) Intel 80386 32-bit


It seemed obvious that these were not standard executables though. I then tried using a Windows 7 32-bit recovery disk (Windows 7 recovery disks are temperamental with regards to the base system and the disk version (a 32-bit disk should be used with a 32-bit base system) if you're curious). This allowed me to move into the required directory, run D:\RECOVERY.EXE, and kick off the recovery process. Even though it seems possible or obvious that there may have been some co-operation between Microsoft and Fujitsu, it looks as though the recovery process isn't quite as clean as it should be (progress meters aren't perfectly aligned, strange screens for particular processes, and so on). Nonetheless, it did what was required (interestingly, it deleted/copied files to/from the existing Windows partition rather than formatting it) and copied and ran the correct Windows installation files.

Some notes regarding this platform include the following:
- possible design flaw? On my particular system the power cable was quite simply too loose.
- compared to modern imaging techniques/systems (Norton Ghost) the restoration process was extremely slow.
- the Windows Security button is at the top left in portrait mode if you don't know. Hold it down for three seconds in order to activate the Ctrl-Alt-Del sequence on the Windows login screen.
- the left hand side of the screen in portrait mode can warm/heat up substantially as the hard drive is located directly underneath.
- even though it is extremely stylish, it has a tiny battery and a high weight, though this is typical of Windows tablets of this vintage.

- as usual thanks to all of the individuals and groups who purchase and use my goods and services
http://sites.google.com/site/dtbnguyen/
http://dtbnguyen.blogspot.com.au/

Tuesday, April 17, 2012

Diverging Convergence

The 'Convergence Effect' report has become far larger than I had originally intended. Part of this was due to feedback which indicated that "more detail" would make the document more useful, part of it has to do with me thinking of other ideas, and part of it has to do with me working on a few side projects (some are related to 'Security') for which this document provides a good outlet for some of my notes.

Another reason is that I've discovered that LibreOffice currently has some issues dealing with large documents (I'm around the 210 page/61000 word mark) that aren't in the native file format (switching to working in the native 'odt' file format seems to work well, but for reasons of compatibility I'd like access to other formats as well). The side effects of this can be data loss, inexplicable crashing, etc...

For these reasons, I will be splitting the 'Security' chapter (the new document will be more 'technically orientated' and will take offensive and defensive positions into consideration) into a separate document (for reasons of coherence, I will maintain much of the current content/state of the 'Convergence Effect' report) and will consider these projects as largely separate entities.

As an aside, I recently discovered a bug with cups-pdf which means that partial page loads aren't well supported under certain web browsers. I've worked around the issue for now by switching to a different browser.

I've also had some feedback that the following is a bit complex, http://dtbnguyen.blogspot.com/2011/03/eeepc-recovery-without-recovery.html
Since I recently damaged my recovery disk, I can concur that the article can be a little difficult to follow for someone who has never tried it before. Below is a simplified version:

1) Get 1005P recovery disc/image
2) Copy/extract contents to an 'arbitrary folder' which will be used to build a new image
3) Use ImgBurn to extract boot sector from recovery disc/image
4) Get the recovery WIM image from your 1015PD
5) Use a WIM utility to split the WIM image obtained from your 1015PD into sub-4GB files (see the sketch after this list)
6) Rename WIM files to asus.wim, asus2.wim, asus3.wim, etc... and drop them in the root directory of your 'arbitrary folder' where you stored the contents of the recovery image
7) Use ImgBurn to use boot image from disc/image on new recovery image
8) Create image
9) Write the image to an optical disc
10) Test whether it boots by using VM software such as VirtualBox or VMWare
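
Regarding step 5, the split can be done from Linux with wimlib's imagex clone; this is an assumption on my part (the original write-up presumably used Microsoft's imagex /split), so treat it as a sketch:

mkdir wim-parts
wimlib-imagex split asus.wim wim-parts/asus.swm 4000                    # split into <4GB parts: asus.swm, asus2.swm, asus3.swm, ...
cd wim-parts && for f in asus*.swm; do mv "$f" "${f%.swm}.wim"; done    # rename the parts to asus.wim, asus2.wim, etc... for step 6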

Friday, March 30, 2012

Automated Research

I never thought that I would be writing about this so soon. Apparently, the US Government announced today/yesterday (depending on your location) that it would be pursuing an initiative into 'Big Data' and a form of automated research that I had been exploring in my 'Convergence' document (still working on it if you're wondering). While it's certainly not a guarantee of success in this area, I believe the implications may be far wider than any of us may ever imagine. Having algorithms which are capable of searching through masses of data, looking at it, and extracting relationships and possibly other insights from it could result in an avalanche of human knowledge and discovery.

http://news.cnet.com/8301-11386_3-57406484-76/why-science-really-needs-big-data/
http://www.whitehouse.gov/sites/default/files/microsites/ostp/big_data_press_release_final_2.pdf

### Start Quote ####
OBAMA ADMINISTRATION UNVEILS BIG DATA INITIATIVE: ANNOUNCES $200 MILLION IN NEW R&D INVESTMENTS Aiming to make the most of the fast-growing volume of digital data, the Obama Administration today announced a Big Data Research and Development Initiative. By improving our ability to extract knowledge and insights from large and complex collections of digital data, the initiative promises to help solve some the Nation's most pressing challenges. To launch the initiative, six Federal departments and agencies today announced more than $200 million in new commitments that, together, promise to greatly improve the tools and techniques needed to access, organize, and glean discoveries from huge volumes of digital data. In the same way that past Federal investments in information-technology R&D led to dramatic advances in supercomputing and the creation of the Internet, the initiative we are launching today promises to transform our ability to use Big Data for scientific discovery, environmental and biomedical research, education, and national security, said Dr. John P. Holdren, Assistant to the President and Director of the White House Office of Science and Technology Policy. To make the most of this opportunity, the White House Office of Science and Technology Policy (OSTP) in concert with several Federal departments and agencies created the Big Data Research and Development Initiative to: Advance state-of-the-art core technologies needed to collect, store, preserve, manage, analyze, and share huge quantities of data. Harness these technologies to accelerate the pace of discovery in science and engineering, strengthen our national security, and transform teaching and learning; and Expand the workforce needed to develop and use Big Data technologies.
### End Quote ####

- as usual thanks to all of the individuals and groups who purchase and use my goods and services

Thursday, March 29, 2012

Convergence Effect

If you've been watching this blog you may have noticed that there hasn't been a lot of activity lately. Part of this has to do with me working on other projects. One of these includes a report that I call the "Convergence Effect" which is basically a follow-up to "Building a Cloud Computing Service". If you're curious, both documents were/have been submitted to various organisations where more good can be done with them. Moreover, I consider both works to be "WORKS IN PROGRESS" and I may make extensive alterations without reader notice. The latest versions are likely to be available here:



Convergence Effect
 
ABSTRACT
A while back I wrote a document called "Building a Cloud Service". It was basically a document detailing my past experiences and some of the issues that a cloud company may face as it is being built and run. Based on what has transpired since, a lot of the concepts mentioned in that particular document are becoming widely adopted and/or the industry is trending towards them. This is a continuation of that particular document and will attempt to analyse the issues that are faced as we move towards the cloud, especially with regards to media and IT convergence. Once again, we will use past experience, research, as well as current events and trends in order to write this particular report. I hope that this document will prove to be equally useful and will provide an insight not only into the current state of affairs but also a blueprint for those who may be entering the sector as well as those who may be using resources/services from this particular sector. Please note that this document has gone through many revisions and drafts may have gone out over time. As such, there will be concepts that may have been picked up and adopted by some organisations (as was the case with the "Cloud" document with several technologies) while others may have simply broken cover while this document was being drafted and sent out for comment. It also has a more strategic/business slant when compared to the original document, which was more technically orientated.

- as usual thanks to all of the individuals and groups who purchase and use my goods and services
http://sites.google.com/site/dtbnguyen/
http://dtbnguyen.blogspot.com.au/

Tuesday, February 14, 2012

Random Shuffling

I've recently been using a neglected version of a tool that did not possess randomisation capabilities. I decided to get around this problem by adding/creating a random shuffler so that subsequent results could not be as easily detected by automated systems. Some of the algorithms which I examined included the following.

http://en.wikipedia.org/wiki/Shuffling
http://www.codinghorror.com/blog/2007/12/shuffling.html
http://amazoninterview.blogspot.com.au/2007/05/card-shuffle-algorithm.html
http://www.codeguru.com/forum/archive/index.php/t-339308.html
http://tekpool.wordpress.com/2006/10/06/shuffling-shuffle-a-deck-of-cards-knuth-shuffle/
http://discuss.fogcreek.com/joelonsoftware/default.asp?cmd=show&ixPost=178050
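
Since the tool I patched isn't really the point here, a minimal bash sketch of the Fisher-Yates/Knuth shuffle covered in the links above is probably more useful (note that $RANDOM only goes up to 32767, so this is only suitable for short lists):

#!/bin/bash
# Fisher-Yates (Knuth) shuffle over whatever arguments are passed in
shuffle() {
  local -a a=("$@")
  local i j tmp
  for ((i=${#a[@]}-1; i>0; i--)); do
    j=$((RANDOM % (i + 1)))          # pick a random index from 0..i
    tmp=${a[i]}; a[i]=${a[j]}; a[j]=$tmp
  done
  printf '%s\n' "${a[@]}"
}

shuffle one two three four five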

- as usual thanks to all of the individuals and groups who purchase and use my goods and services

Butter Fried Potatoes With Bacon Bits, Sour Cream, and Spring Onion

There used to be a dish at the Pancake Parlour that I took a particular liking to. Over time, I realised that there were 'economic' issues pertaining to this dish and I also thought that there were some modifications (basically changing the balance by using more of certain ingredients) that could be made to make for a better tasting recipe. Below are the results of my experimentation.

Ingredients
- butter
- olive oil
- bacon
- onion (optional)
- potatoes
- salt
- sour cream
- grated cheese
- spring onion

1) Brown/fry bacon bits or diced pieces of bacon in a pan/skillet.
2) Remove from the pan/skillet and place on a plate (drain oil if desired using paper towel).
3) Chop the potatoes/onions into cubes.
4) Use a combination of butter/oil (you can use butter only but I find that adding a little oil helps to change the burning point of the fat mixture, which means that you can cook more quickly without having to worry as much about burning) to brown/fry the potatoes (if you prefer bigger potato chunks then you may want to boil them first to aid the cooking process). Note that even if you slightly burn the potatoes you can often just add another knob of butter and use slow, lower temperatures to 'fix the error'. Obviously, cooking slowly at a lower temperature will soften the potatoes while cooking quickly at a higher temperature will make them crispier. Place the onions in later because they cook more quickly.
5) Remove the potatoes/onions from the pan/skillet and place on a plate (drain oil if desired using paper towel).
6) Drop grated cheese on top of the potatoes/onions/bacon and put into the microwave for 30 seconds to melt it, or else use the grill.
7) Salt to taste and use sour cream and chopped pieces of spring onion to garnish.
8) Goes very well with lemonade. Enjoy.

- as usual thanks to all of the individuals and groups who purchase and use my goods and services
http://sites.google.com/site/dtbnguyen/
http://dtbnguyen.blogspot.com.au/

Wednesday, January 4, 2012

Personal Backup Solutions

I recently decided to implement an automated regime of backup (I tended to favour manual backups previously quite simply because there wasn't that much to backup). Below are some of my research notes.

BackupPC

There are obviously usability issues that are similar to some of the issues that I've been facing working on a personal project. It feels like they've done a one-to-one translation of what's in the configuration file to the web interface, and even though it works well it just feels a little rough around the edges. My expectation of user interface design is that in most cases (especially consumer class applications) software should not require you to read a manual. In this case, though, it seemed as though some of the labels were confusing/contradictory and the only way to debug was to resort to the CLI. Setup was simple. Install using the repo.

- chkconfig backuppc on
- service backuppc start
- service httpd start
- cd /etc/BackupPC
- htpasswd -cmb apache.users backuppc backuppc
- /etc/BackupPC/config.pl is actually a valid Perl file. Configure as required
- su -s /bin/bash backuppc
- ssh-keygen -t dsa
- ssh-copy-id -i .ssh/id_dsa.pub root@host.domain.com

It uses DNS to resolve 'viable hosts' and NetBIOS multicast thereafter. If all else fails there is a fallback option to change the resolution mechanism via the config file by changing the parameters for the 'nmblookup' command. You may need to change the MAIL environment variable since by default the 'backuppc' account has no configuration files or environment variables set up (I had to because I switched from ~/Maildir to /var/log/mail/*). Use /etc/aliases to forward email to another address. Documentation needs a bit of work. Some parts are skimmed over while others are quite verbose. Had some protocol mismatch issues when the backup process was initiated. The rsync man page indicated that it may be related to login/configuration files producing unexpected output (use 'ssh remotehost /bin/true > out.dat' to debug; if the file contains anything at all then obviously there are issues that need to be fixed. You see an error relating to only a certain amount of data being received in the /var/log/BackupPC/* log file/s, which is also viewable in the web interface.)
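
To expand on that debugging step: the remote login must produce no output at all for rsync's protocol to survive, so the quickest check is something like the following (host name reused from the setup commands above):

su -s /bin/bash backuppc
ssh root@host.domain.com /bin/true > out.dat     # run a no-op remotely, capturing anything printed at login
wc -c out.dat                                    # must be 0 bytes; any banner/motd/echo output here breaks the transfer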

Massive performance issues on an i3 with 4GB once the backups started. The desktop environment actually began to suffer significant latency issues, with the mouse cursor skipping halfway across the screen a number of times. Had to kill the process eventually. Looked at other options such as using a different transfer method/protocol (other than rsync). Thought about using a solution based on cpulimit, http://cpulimit.sourceforge.net/ that I built a while back (to deal with similar issues with bacula and other pieces of software) which would basically act like an ABS brake on CPU utilisation and also automatically change the priority of the process via scripts (see the sketch below). Further research indicated that this issue has been alleviated or fixed in subsequent revisions of BackupPC though.
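
The cpulimit/renice approach I'm referring to is roughly the following; BackupPC_dump as the worker process name is an assumption, so adjust it to whatever shows up in ps on your system:

cpulimit -e BackupPC_dump -l 25 &                 # throttle the dump process to roughly 25% of one CPU
renice -n 19 -p $(pgrep -f BackupPC_dump)         # and push it to the lowest scheduling priority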

Bacula

Prefer this solution over others because while it's not perfect it's still not a full-blown backup solution that can be unwieldy to deal with. I remember using bacula and there used to be some inexplicable errors in the database catalogue as well as some backup failures that couldn't be explained without delving overly deep into logfiles. Over time I figured out how to deal with them and achieved a perfect backup schedule, but to be honest I just wanted a guarantee that it would work. One thing I did like about it though was the bconsole CLI interface. A single point from which to deal with mount/unmount/restore/backup of data.
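
For anyone unfamiliar with it, bconsole can also be driven non-interactively, which is part of its appeal; the job name below is the stock sample from bacula-dir.conf and is an assumption for your setup:

echo "status dir" | bconsole                      # overall director and job status
echo "list jobs" | bconsole                       # catalogue view of previous backups
echo "run job=BackupClient1 yes" | bconsole       # kick off a backup without prompting
# restore/mount/umount work the same way from the interactive prompt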



Lightweight Options

Considered other lightweight (and even desktop) options such as rdiff-backup/backintime (basically a bunch of scripts) but they're a bit unwieldy and also don't have the logging and diagnostics that BackupPC/bacula and other systems have.

Cloud/Filesystem Options

Have thought about cloud based and filesystem based solutions but have backed away for security/bandwidth reasons and would prefer not to rely on the filesystem only.


Amanda

Remember Amanda from a while back. Had half configured it previously (a basic setup in an experimental environment for possible use in production). This time I decided to do a more complete setup with 'virtual tapes/slots' in a 'virtual multi tape changer' arrangement. Installed using repos, copied the relevant xinetd.* file to xinetd.d, and ran the following as indicated in the crontab sample file (see the schedule sketch below).

amcheck -m DailySet1
amdump DailySet1
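
The schedule from the sample crontab file amounts to something like the following (/etc/crontab style; the times, paths, and the 'amanda' user name should be treated as assumptions for your distro):

0 16 * * 1-5   amanda   /usr/sbin/amcheck -m DailySet1    # mail a report if tonight's run would fail
45 0 * * 2-6   amanda   /usr/sbin/amdump DailySet1        # run the actual dump early each morning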

Had "amanda client 10080 ack timeout fedora" errors. Packaging was slack. Provisions weren't made in xinetd.d/* files in order to properly locate amindex and other file/s causing port 10080 service not to be started. Perhaps it was just the 64-bit version?


Need to create 'virtual tapes/slots' under '/var/amanda/vtapes/slot?'
/dumps/amanda used as temporary storage prior to dumping to (in this case virtual) tape.

amtape
amlabel
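
Creating and labelling the virtual tapes boils down to something like this (slot count, label names, and the 'amanda' user are arbitrary/assumed here):

for i in $(seq 0 9); do mkdir -p /var/amanda/vtapes/slot$i; done    # one directory per virtual slot
chown -R amanda: /var/amanda/vtapes                                 # must be writable by the amanda user
amlabel DailySet1 DailySet1-01 slot 0                               # label slot 0; repeat (or loop) for the rest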

  
/var/lib/amanda/.amandahosts works in a similar way to .rhosts bypass file to control who and which servers can backup/restore.


'strings /var/amanda/vtapes/slot0/* | less' gives you
'dd if=* bs=32k skip=1 | gzip -dc | tar -xpGf -'

Not pretty, even if you're doing it the 'proper way'.

amrecover
amrestore


As an aside, I restored to the /tmp directory. Somehow the permissions ended up erroneous, which led to issues with Gnome (dealt with by setting the correct permissions on the /tmp directory). All the more reason to set up a separate restoration area.


Logging layout could be streamlined. It's spread out over many different files and directories, which makes it easier to spot a particular time frame but complicates things. 'ls -al' and multitail are your friends here.


Long in the tooth and it shows. However, there does seem to be an effort to modernise judging by the website/wiki. Zmanda (an updated version of amanda with a web based management console) should definitely be at the back of your mind if you ever think about using amanda.





Saturday, December 10, 2011

Open Source Routing Research Notes

If you've ever used Vyatta before you've probably noticed its very Cisco-ish syntax even though it clearly has a Linux heritage. I've been working on a project that has required a better understanding of how this is ultimately achieved. Quagga, Bird and Xorp came into my sights after a preliminary search as well as a number of other 'younger' projects.

Initially, Bird/Xorp configuration seems to be a lot less readable than Quagga's but simpler in that it relies on only one file. Edge to Quagga, then Bird, and finally Xorp. Research indicates that Xorp may have been the default open source routing engine prior to Quagga. Whitepapers suggest that open source routing software on commodity hardware is able to achieve similar or superior speeds/performance to that of proprietary solutions at a fraction of the cost. The learning curve is reduced by the use of syntax similar to the Cisco/JUNOS CLIs and of course by the open nature of the routing protocols. Documentation for all options seems to be reasonable.





"zebra is an IP routing manager. It provides kernel routing table updates, interface lookups, and redistribution of routes between different routing protocols.", http://www.quagga.net/docs.php

# and ! used to mark comments in configuration files.

Telnet directly to port 2601 (telnet to the higher ports to configure/monitor the other protocol daemons) and configure/monitor using a Cisco like CLI interface (see the sample session below). As with the Cisco/JUNOS CLIs, modes (enable, global config, interface, etc...), commands (show, no, router, etc...), and auto-completion of commands are also supported.
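
For reference, the default vty ports and a minimal session look roughly like this (assumes ospfd is running and that you know the vty password from its configuration file; the network statement is purely illustrative):

# default Quagga vty ports: zebra 2601, ripd 2602, ripngd 2603, ospfd 2604, bgpd 2605, ospf6d 2606, isisd 2608
telnet localhost 2604
# then, at the ospfd prompt (Cisco style):
#   enable
#   configure terminal
#   router ospf
#   network 192.168.1.0/24 area 0
#   end
#   show ip ospf neighbor
#   write memory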

Most major protocols are catered for except proprietary ones such as IGRP and EIGRP. For example, RIP (v1/v2), RIPng (which handles IPv6), OSPF (v2/v3, where v3 handles IPv6), BGP, and even IS-IS are supported.

Cisco style access-list command and complete support for IPv6 available.


If you're familiar with the Cisco CLI then you'll be completely at home with the Quagga interface. Commands are identical in most cases. In fact, it's a fairly good way of revising for the CCNP BSCI and other Cisco certification exams if you don't have access to actual equipment or simulation software.



Although I haven't attempted to use this apparently 'SMUX' allows you to reference information from the various Quagga services by providing a bridge between SNMP and Quagga. A more apt description would be that, "SMUX is the snmp multiplexing protocol (RFC 1227). It can be used by an snmp agent to query variables maintained by another user-level process."

While I was doing all this I decided to 'clean up' a number of other anomalies on my system. For example, 'kdump' was not starting on boot. The obvious solution would be to simply remove the kdump package but I just wanted to make it work. Thought it was just a simple configuration issue. Went through uncommenting the default options but it still wasn't starting. Went through the relevant 'init' file. Noticed that it required a 'crashkernel' parameter (eg. crashkernel=128M@128M) in order for it to work (/proc/cmdline contains the kernel boot parameters if you're curious. While other /proc files can be written to, this one was not responsive to chmod or to being written to even if you are root). Ultimately, the only way to test is to modify the kernel boot parameters via /boot/grub/menu.lst. However, I then noticed that the kernel wasn't configured with this option available. Hence, boot was not possible. Had to update the kernel and kernel source (required for my work with other packages). Thereafter, kdump startup was possible ("echo 1 > /proc/sys/kernel/sysrq" followed by "echo c > /proc/sysrq-trigger" can be used to create a kernel crash situation if you're curious).

yum update kernel
yum update kernel-headers
yum update kernel-devel
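
For reference, the crashkernel reservation mentioned above is just an extra parameter on the kernel line in /boot/grub/menu.lst; everything other than the crashkernel value below is a placeholder standing in for your existing entry:

title Fedora (placeholder)
        root (hd0,0)
        kernel /vmlinuz-<version> ro root=/dev/sda1 crashkernel=128M@128M
        initrd /initrd-<version>.img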

Once again, noticed that performance when the MTU is 1392 is much better than at 1500 during the download of packages via yum. The kernel upgrade 'broke' my VirtualBox setup though. Obviously I suspected kernel module issues, so I went to /usr/src/vbox_host-* and had to re-run 'make' to re-compile and re-register the relevant kernel modules. Noticed later on that there was an option for the vboxdrv init file ('setup') which was 'more correct' though (see below). Used this and VirtualBox startup was all good again. It was a bit easier with VMWare Server though; kernel modules are automatically re-compiled/re-registered and set up on startup.
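
The 'more correct' vboxdrv route mentioned above is roughly the following (run as root; the init script path is from memory on this era of VirtualBox, so treat it as an assumption):

/etc/init.d/vboxdrv setup        # rebuild and reload the VirtualBox kernel modules against the new kernel
lsmod | grep vboxdrv             # confirm the module is back in place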

The following is useful if you're interested in more about kernel dump analysis,


While completing all of the above, I remembered previous work regarding PKI certificates to set up OpenVPN. Looked for mkcert.sh/CA.pl and found them, but also noticed tinyca2 and openvas-mkcert. tinyca2 is a very simple GUI based application for managing certificates while openvas-mkcert is CLI based. Haven't tried using these certificates with OpenVPN as yet though. Will experiment another time.
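
If you've never used CA.pl, the basic flow is along these lines (the script's location varies by distro - /etc/pki/tls/misc or /usr/lib/ssl/misc are common - so the path is an assumption):

/etc/pki/tls/misc/CA.pl -newca      # create a new CA (key plus self-signed certificate)
/etc/pki/tls/misc/CA.pl -newreq     # generate a key and certificate request
/etc/pki/tls/misc/CA.pl -sign       # sign the request with the CA created above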


http://en.wikipedia.org/wiki/List_of_free_and_open_source_software_packages

- as usual thanks to all of the individuals and groups who purchase and use my goods and services
http://sites.google.com/site/dtbnguyen/
http://dtbnguyen.blogspot.com.au/

Friday, December 9, 2011

WINEing and DOSing on Linux

I recently picked up some old (but still very enjoyable) games, even though their graphics mightn't be up to today's standards. These games included RAC Rally Championship (DOS), Theme Hospital (Windows 95), and Rise of Nations (Windows XP). This post will go through the steps required to get them working under Linux. Note that I'm assuming that you have installation discs for all relevant programs.

The first game I tried was RAC Rally Championship. I consider it to be one of the better rally arcade/simulator games prior to Colin McRae Rally and DiRT. To get it running, first install 'DOSBox' (it's a DOS emulator for various platforms). Start up the 'dosbox' environment from the Linux CLI or via the relevant GUI menu option, then run the following series of commands.


- mount c /media/disk -t cdrom (mount your installation disc to C:)
- mount d ~/rally (mount a folder from your Linux home folder called rally to D: to give the installer a place to place its files. Assumes you have already created the folder of course)

- C: (switch to the C: drive)
- install.exe (run the installer file. When prompted install it to D:\)
- cd d:\rally (when installation has completed change to the installation directory)
- ral.exe (run the game)

One thing you'll definitely notice is that while the games are still quite enjoyable they'll often feel quite 'eerie'. Imagine existing in the world of 'True Colour' (32-bit and millions of colours) and then suddenly finding yourself being transported to a world where only 256 colours exist.

 
The next game we'll attempt to set up/run is 'Theme Hospital'. Similar in concept to 'Zoo Tycoon', it has you running a hospital, attempting to find a happy medium between healing patients and fulfilling more tangible/financial goals. I discovered that while installation was quite easy (insert the disc, mount it, then run the setup/installation file via 'wine'), getting everything running perfectly wasn't.



First I needed to get DirectX installed. Instructions are available online for this which involve downloading a redistributable installation file from the Microsoft Download Centre; I opted to use my Tiger Woods 08 DVD to get the DirectX installation file. While the game worked, sound was not functional. Using 'winecfg' indicated that I was getting 'write' errors to the sound device file and testing obviously produced no sound.



I read online that a lot of others were having issues with the 'PulseAudio' sound daemon. I also read that removing the package/s could result in 'odd' issues with the desktop installation. Like others, I discovered that killing the pulseaudio daemon doesn't actually stop it because it respawns by default. Altering the relevant pulse/client.conf file should have changed this behaviour (whether done to the core configuration in /etc or via a user's local .pulse setup) but didn't, which indicated that there must be another point of configuration as well. I disabled it through the 'Gnome Startup Preferences' menu option/s. At that point the error/s disappeared, but switching between the different sound daemons still didn't produce success (ESound, Pulse, OSS, ALSA) even though I had installed all the relevant 'wine to sound daemon' plugins. Finally, I shut down everything (sound related) except the core 'PulseAudio' daemon and the test/game sound finally seemed to work, even though others have indicated that using ALSA seems to be the best/easiest solution.
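
If you want to go down the client.conf route mentioned above, the setting that is supposed to stop the respawning is 'autospawn' (per-user file shown here; the system-wide equivalent lives in /etc/pulse/client.conf):

echo "autospawn = no" >> ~/.pulse/client.conf    # tell clients not to respawn the daemon
pulseaudio --kill                                # stop the running daemon; it should now stay down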


While the installer for 'Rise of Nations' went well, I've experienced many 'known issues'. Among these: having to use both mouse buttons to navigate through the menu system (though the left mouse button seems to work perfectly fine in the actual game itself), some regular graphical anomalies (this only occurred with Rise of Nations but not with Rise of Nations - Thrones and Patriots), a temporary black screen or stall on startup (not a real stall; use the space bar to reach the main menu), and having no sound. I attempted to switch the DirectX libraries between builtin/native as directed elsewhere online to get around these issues, but switching to native actually caused an exception to occur on startup of the game so I have since gone back to the program defaults. I also believe that some of these problems may be version related so I'll update at a later time.



As an aside, I recently scratched one of my game discs (the game would only run through the install process part of the way before succumbing to read errors) but had run out of disc repair solution. I've since discovered that toothpaste actually works quite well as a repair agent since it's a light/mild abrasive. It tends to blend the scratch in with its surroundings rather than polishing the disc though.



I recalled the early days of optical drive technology. There were often (and there still are, though less drastic) differences in error detection/correction quality between different drives (I remember a disc that was completely unreadable in one drive but perfectly readable in another). I switched from an onboard OptiArc drive to an external Plextor drive and disc reading seemed to be perfect.

I decided it was time for me to make an ISO backup of the disc (allowed under existing law). While the backup disc can be used to install the program, there are mechanisms which prevent it from being used as a startup/game disc (even via emulation software using just the ISO file). There are obviously ways around this type of detection, but it's always a game of cat and mouse between those who create copyright protection mechanisms and those who attempt to defeat them.
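
For the straightforward (non copy-protected) part of that exercise, a basic ISO image can be taken with dd; the device name is an assumption, so check yours first:

dd if=/dev/sr0 of=game.iso bs=2048 conv=noerror,sync    # raw 2048-byte data sectors; keep going past scratched areas
isoinfo -d -i game.iso                                  # quick sanity check of the resulting image (genisoimage package)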


http://www.abtevrythng.com/2011/03/how-to-use-amazon-app-store-outside-us.html

- as usual thanks to all of the individuals and groups who purchase and use my goods and services
http://sites.google.com/site/dtbnguyen/
http://dtbnguyen.blogspot.com.au/
