Monday, 17 October 2011

Microsoft Moves to Lock Out Linux

Microsoft has announced plans for a security feature in Windows 8 that has the potential to prevent users from installing operating systems of their own choosing, including Linux.

Early versions of Windows 8 do away with the established BIOS > bootloader > OS booting paradigm in favor of something called the Unified Extensible Firmware Interface (UEFI). This booting system has enabled Microsoft to demonstrate machines booting Windows 8 in a respectable-even-for-Linux eight seconds, but it's the security options built into UEFI that are the real story.

UEFI includes a secure boot protocol designed to stop bootloader attacks, in which rootkits or other malware are loaded underneath the operating system at boot time. Any code loaded at boot has to be signed with a key the firmware recognizes, enabling UEFI to lock out unauthorized code. Original Equipment Manufacturers (OEMs) will have to implement this feature in order to get their products Windows 8-certified.
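To make the idea concrete, here is a deliberately simplified sketch in Python. It reduces the whole key infrastructure to a hash allow-list (real UEFI secure boot verifies signatures against a firmware database of certificates and hashes), but the lock-out logic works along these lines:

import hashlib

# Hypothetical firmware database: digests of the boot images that the OEM
# (or the user, where the firmware allows key enrollment) has authorized.
AUTHORIZED_DIGESTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # placeholder entry
}

def secure_boot_check(bootloader_image: bytes) -> bool:
    """Boot only images whose digest appears in the firmware's allow-list."""
    return hashlib.sha256(bootloader_image).hexdigest() in AUTHORIZED_DIGESTS

# Any unknown bootloader - a self-compiled Linux one, say - is refused:
print(secure_boot_check(b"some unsigned bootloader image"))  # False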

The implications of this for Linux users are obvious. As there is no central registry of keys, it will be up to the PC vendor to decide which keys - and therefore which code - are trusted, giving the manufacturers unprecedented control over what can be installed on your machine.

Predictably, that's not how Microsoft program manager Tony Mangefeste sees it. In a blog post he said: "At the end of the day, the customer is in control of their PC. The security that UEFI has to offer with secure boot means that most customers will have their systems protected against bootloader attacks. For the enthusiast who wants to run older operating systems, the option is there to allow you to make that decision."

Quite apart from disingenuously branding Linux an "older" operating system, Mangefeste contradicts his "the customer is in control" sentiment later in the same blog: "Microsoft supports OEMs having the flexibility to decide who manages security certificates and how to allow customers to import and manage those certificates, and manage secure boot."

Matthew Garrett, a mobile Linux developer at Red Hat, hit the nail on the head with his response: "There's no indication that Microsoft will prevent vendors from providing firmware support for disabling this feature and running unsigned code. However, experience indicates that many firmware vendors and OEMs are interested in providing only the minimum of firmware functionality required for their market. It's almost certainly the case that some systems will ship with the option of disabling this. Equally, it's almost certainly the case that some systems won't.

"It's probably not worth panicking yet. But its worth being concerned."

What Is Cloud Computing?


It's the buzzword on everybody's lips, but what does cloud computing actually mean? It has not been an easy term to define, and there have been many different attempts to explain it. Cloud companies have been prone, like Humpty Dumpty in Through the Looking-Glass, to make the term mean just what they choose it to mean.

In some ways it's strange that the term has been so slippery. Millions of us are happy to use such cloud-based services as Facebook, Gmail and Twitter, thinking nothing of it, yet pinning down an exact definition has been as elusive as grabbing a cloud itself.

In an attempt to put a stop to these vagaries, the US National Institute of Standards and Technology (NIST) put forward a definition that is now widely accepted as the closest thing the industry has to a definitive answer. The NIST definition is as follows:

"Cloud computing is a model for enabling, convenient, on-demand network access to a shared pool of configurable computing resources ( eg. networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model promotes availability and is composed of five essential characteristics, three service models, and four deployment models."

The service models are types of offering, such as software-as-a-service (SaaS), and deployment choices include public and private clouds. But the key characteristics of cloud from a customer's point of view are:

    Self-provisioning, so a customer can provision facilities without any human interaction;
    Delivery of services over a network;
    Ability to be accessed by a variety of devices - not just PCs but also netbooks, tablet computers and smartphones;
    Rapid 'elasticity' - the ability to scale computing resources up or down.

From a cloud provider's point of view, a major element of the process is the pooling of computing resources to serve multiple consumers, using what's called a multi-tenant model, whereby cloud services are provided to customers as and when they're needed. One of the important requirements for cloud service providers is the ability to measure usage accurately and, even more importantly, to bill accurately.
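As a toy model in Python (not any real provider's API), the sketch below captures several of those ideas at once: self-provisioning with no human in the loop, elastic scale-up and scale-down against a shared multi-tenant pool, and metering so usage can be billed:

class ToyCloud:
    """A toy multi-tenant resource pool with self-service provisioning and metering."""

    def __init__(self, total_vms):
        self.free = total_vms        # the shared pool
        self.used = {}               # tenant -> VMs currently held
        self.vm_hours = {}           # tenant -> metered usage for billing

    def provision(self, tenant, vms):
        """Self-service: succeeds instantly if the pool has capacity."""
        if vms > self.free:
            return False             # pool exhausted
        self.free -= vms
        self.used[tenant] = self.used.get(tenant, 0) + vms
        return True

    def release(self, tenant, vms, hours_held):
        """Elasticity: scale back down, recording usage as we go."""
        self.used[tenant] -= vms
        self.free += vms
        self.vm_hours[tenant] = self.vm_hours.get(tenant, 0) + vms * hours_held

    def bill(self, tenant, rate_per_vm_hour):
        return self.vm_hours.get(tenant, 0) * rate_per_vm_hour

cloud = ToyCloud(total_vms=100)
cloud.provision("acme", 10)                       # no human interaction needed
cloud.release("acme", 10, hours_held=6)           # scale down when demand drops
print(cloud.bill("acme", rate_per_vm_hour=0.10))  # 6.0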

Security concerns

The factor in cloud services that makes most users nervous is the level of security within a multi-tenant model, and it is a major concern: customers are entrusting some of their sensitive data to a third party, and there is, of course, nothing stopping one of their major competitors from going to the same cloud provider for a service.

Service providers believe that this concern can be easily dealt with: they generally have a long history of keeping customers' data safe and operate levels of security that far exceed those of their customers. Take Amazon, one of the leading lights in cloud technology. Millions of us around the world are willing to entrust our personal details and credit cards to that company, believing they'll be held safely, so why should trusting the company's cloud division, Amazon Web Services, be any different?

In many ways, a more important consideration than security is the location of the data. This is for two reasons. First, there is the inherent latency within the system: the further away the data is stored, the longer the lag in accessing it. This is becoming less of a problem as network connections get faster, but it can still be a factor.

The second problem is a more serious one, particularly on this side of the Atlantic. There are various EU regulations on where data can be stored - personal data cannot be held outside the EU (and within the EU itself, individual countries have stricter guidelines still). This has been a problem for some cloud providers, as part of the appeal of cloud is that unused resources at one data center can be used by another. If data centers outside the EU cannot store European customers' data, providers have to be careful in marshaling their resources.

Allied to this is a secondary problem: the US Patriot Act, which compels US companies to hand over personal data held on their servers if requested by US authorities. As this applies even to European data held on servers located in Europe, it has made some European customers rather nervous. The implications of the Patriot Act are still being worked through.

Virtual world

There are other key concepts within cloud computing, and virtualization is one of them. It means what it says: the use of virtual resources instead of physical ones. For example, a server within a data center may be operating at just 15% of its capacity (historically a typical figure); virtualization is a technique whereby the resources not being used by the application the server is driving (a database, a website or whatever) can be used for something else, driving usage rates up. Virtualization often goes hand-in-hand with server consolidation, so it helps to reduce the number of servers within a data center.
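The consolidation arithmetic is easy to sketch. Assuming, purely for illustration, twenty physical servers idling at 15% utilization and a 75% utilization target for the virtualization hosts:

import math

servers, avg_utilization = 20, 0.15   # assumed figures for illustration
target_utilization = 0.75             # leave headroom on each virtualization host

total_load = servers * avg_utilization                     # 3.0 "servers' worth" of real work
hosts_needed = math.ceil(total_load / target_utilization)
print(hosts_needed)                                        # 4 hosts instead of 20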

Like cloud computing itself, virtualization is an old concept, originating in the mainframe world and only becoming widely used after VMware, a virtualization specialist, started applying it to servers. The technology has now been adopted almost universally within enterprises, and the technique of re-allocating resources has made it vital to the development of the cloud.

We've spoken a lot about cloud service providers, but another important part of the cloud is the delivery of software: the so-called software-as-a-service (SaaS) delivery mechanism. This technique was really pioneered by Salesforce.com with its hosted CRM product but has since been adopted by countless other companies. SaaS delivery helps solve various problems within an enterprise (over-provisioning, security updates and licensing among them) and is now widely seen as a dominant method for providing software.

As a concept, cloud computing has grown quickly and is set to penetrate deeper into the market. According to an oft-cited Gartner report, 20% of enterprises will have no IT department by the end of 2012. While that looks a bit optimistic (or pessimistic, depending on your view), the impetus is clearly with the cloud. It's a technology that's here to stay.


Difference between outsourcing and cloud computing

Outsourcing is widely known and used in technology circles: a third party performs an IT function or other service on behalf of its customer. It can be employed for a variety of reasons - lack of expertise in-house, lack of personnel, or because the resources are needed only for an individual project.

The key differences with cloud lie in the underlying technology of the cloud provider. Essential to this is the use of virtualization (all cloud providers make use of virtualization technology) and automation (the ideal cloud service involves little human intervention). The other key element of cloud computing is self-provisioning: one of the major benefits is the ability to make a business more agile and flexible, because services can be turned up and down at will.

A brief history of cloud computing

Anyone hanging around cloud vendors for any amount of time will hear one oft-repeated mantra - "Cloud computing is not new, you know; cloud has been around for some time" - generally from a veteran of the technology industry. There's an element of truth in this but, at the same time, it spectacularly misses the point. It's possible to point to a 1966 book by Douglas Parkhill, The Challenge of the Computer Utility, for the origins of cloud computing. In that book, Parkhill detailed many of the elements of cloud computing - elastic provision, online delivery, the perception of infinite supply - it's just taken a while for the theory to become reality.

Saying that the theories espoused in Parkhill's book are the first elements of cloud computing is a bit like saying that Leonardo da Vinci's notebooks are the blueprints for the first helicopter: it's one thing to come forward with the theory; it's quite another to deliver in practice. There were plenty of false dawns before cloud computing became the beast it is today. We saw it described as grid computing, computing on-demand and utility computing before the phrase cloud computing took hold; the term has only been widely used since late 2007, although it was first used in a lecture by computer scientist Ramnath Chellappa.

For cloud computing to become a reality, other changes were needed first. The most important of these was the availability of fast and cheap broadband - the early attempts at cloud computing all foundered for the lack of it. Then virtualization needed to become more widespread, as this technology is the bedrock of cloud computing.

Other factors are the declining cost of storage, the availability of cheaper devices to access cloud services, and the development of automatic provisioning software.

Sunday, 16 October 2011

Installing Mac OS X on a Normal PC / Non-Apple PC


The news is not new, but plenty of tutorials on installing Apple's OS X on an x86 computer are now available, and the procedure can be followed by (almost) any PC user.

Let me remind everybody that it's illegal to install and use OS X without a valid licence. I take no responsibility for any improper use of this tutorial.
In addition, please note that the following procedure may cause data loss, so proceed only if you know what you are doing.

We’ll see how to install OS X on a Windows XP machine, on a 7Gb dedicated partition. So you have to prepare your hard disk with:

- The first, primary and bootable partition with a normal Win XP installation;
- A second empty NTFS primary partition;
- Any other partition you like.

Please refer to other documentation if you don't know how to deal with hard drive partitions. You can use, for example, Partition Magic to change your partitions... but if you are not experienced in partitioning, my advice is to stop here.

Then, you have to get some software:

1. An Ubuntu Linux Live CD: you can order one or download it, for free. A valid alternative to Ubuntu is Knoppix STD: use it if you have problems with Ubuntu;
2. VMware Workstation: it's a commercial application, but a free trial is available;
3. Deadmoo's OS X x86 image: this is available on P2P networks, usually under the filename tiger-x86.tar.bz2. Again, remember that you have no right to download or use it if you don't have a valid Apple OS X licence.

- Boot from the Ubuntu (or Knoppix) Live CD;
- Open a terminal window, get administrative rights (entering the command "sudo su" should be sufficient)... and type:
cfdisk /dev/hda
- The cfdisk utility will start: choose the partition where you want to install OS X (it should be the second one, but pay attention!) and choose "TYPE". Now enter "AF" (without quotes, of course) as the type, and confirm the change by choosing "WRITE". The partition you chose will be erased and its type set to AF (the partition type ID used for Mac OS X HFS).

Now reboot, go back to Windows XP, then install and open VMware.
Create a new "FreeBSD" virtual machine. When asked to choose the primary hard drive for the virtual machine, choose your physical hard disk. Also add a second hard drive to the virtual machine: the virtual drive inside Deadmoo's archive. Insert your Ubuntu CD and make sure that your physical DVD/CD-ROM drive is enabled in the virtual machine.
Then start the virtual machine and press ESC as it begins, to enter the boot menu: choose to boot from CD. Ubuntu should start booting in the virtual machine's window. If you see Windows starting instead, immediately power off the virtual machine and check the CD-ROM settings (in that case the CD boot has not started, and your physical Windows XP is booting inside the virtual machine!).

When Ubuntu is ready, open a terminal window and type a command like this:

dd if=/dev/hdb1 of=/dev/hda2 bs=8192
hdb1 (the first partition of the 2nd hard drive) should be the mounted Deadmoo image: our source;
hda2 should be the 2nd partition of the 1st hard drive (our physical one): the destination.
Change hdb1 and hda2 to match YOUR setup!

If you’re not sure about partitions, type:
fdisk -l
to get a list of your connected hard drives and their partition identifiers.

The process will take about 5-10 minutes. At the end, you can shut down the virtual machine.

Download this file.
It contains a file named chain0. Extract it to the root of your C:\ partition and add the following line:
C:\chain0="Mac OS X"
to your C:\boot.ini file (use straight quotes: curly quotes will break the entry).
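After the edit, a typical XP boot.ini would look something like this (the Windows entry shown is the usual XP default; yours may differ depending on your disk layout):

[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect
C:\chain0="Mac OS X"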

Now reboot your PC and choose Mac OS X at the boot menu. The Darwin bootloader will then ask you to select the partition with Mac OS X installed: select it with the arrow keys.

Now type -s and press Enter. If everything goes well, at the prompt type:
sh /etc/rc
passwd curtis
passwd root
and enter your new password when asked. If something goes wrong, try booting with -x or with no arguments.

Please note that sometimes you need to boot several times (with -s, with -x, or with no arguments) to get OS X working... I don't know the reason, but it happens!

NOTE: This method is not emulation (unlike others that run Mac OS X inside a virtual machine; here VMware is used only to transfer the OS X image to the 2nd partition). Mac OS X runs directly on the x86 machine, so speed is the best you can get.

Of course, our PCs are not exactly the hardware Apple's developers had in mind, so speed is not the same as on an original Macintosh.

The Amazing 3MB Operating System


Kolibri is a tiny 3MB operating system written entirely in assembly language, and it is not based on Linux, Windows or any other OS. What is surprising is the number of applications packed into this tiny footprint: a whole bunch of games, a basic spreadsheet editor, a compiler, a text editor, a bouquet of demos, a text-based web browser and more. There is also DOSBox, which lets you run almost any DOS game on Kolibri. While not meant for serious use, this tiny OS shows how bloated Windows and Linux really are.

How to check data usage in BSNL DataOne?


Many of my friends in India use a BSNL DataOne broadband connection because it's quite easy on the pocket, but many of them have trouble viewing their data usage.

Here is the link to check it:

http://10.241.0.195/webLogin.jsp

Just navigate to the link above and enter the username and password provided to you. You will see a few menu options. Click on "Service Records" and check the data usage for the required month.

Note: Use Internet Explorer as the browser. If the link above does not work, reset all of IE's settings, clear all cookies and history, and restart the browser.

Quick tips to drive traffic to your blog or website

If you have started a blog and want to make something of it, you need to concentrate on your traffic from the start, rather than getting your friends to click on the ads. But how do you make people come to your website, read it, and come back again? Analysing that in detail would take almost a year, so here are some quick tips for new bloggers.

    Content: Even if you post just once a day, that post should have quality content. It's better to have a single good post than ten bad ones; readers always appreciate pages with good content, and you should make your best effort to bring them back to your site. Make your posts conversational, pithy and topical. Keep them short and stick to one topic per post. Write often and regularly so that both readers and search engines visit your blog more often.

    Search Engines: One of the best sources of traffic, if your site is well organised. Come up with good keywords and you will get good page ranking; the higher your page rank, the more advertisers will want to pay you. Make sure your blog URL has a strong keyword in its title. Use submission sites to submit your URL to all the search engines, such as Google and Yahoo. Don't forget to submit your feed to RSS directories; there are sites with bulk lists of submission directories, so go there and submit your feed URL. Use feed stats to attract users, like the FeedBurner chicklet I have in my subscribe box.

    Signature: Put your blog address in your email signature so people know about your presence on the web. Go to every place where you find a lot of surfers - forums, social networks, community discussions - and post your link there. Leave your link in comments on the most famous blogs, but remember that Google doesn't count these as backlinks to your site.

    Email subscription: Don't forget to offer a subscribe-by-email link on your blog.

    Social bookmarking: Add "submit my site" links for Digg, Reddit, BlinkList, Buzz and so on at the end of each article you write, like I have on my blog. Submit your articles to these sites to get the best traffic.

    Backlinking: Try to contact the authors of good blogs and request backlinks. It shouldn't look like you are begging; write something like: "I have something interesting for you on my webpage. If you find my site or blog interesting, please give me a link to my site on your webpage."


I hope these tips help you to a good extent. Don't forget to post your comments and any other additions.

Robots.txt: The most powerful text file on the web


"Robots.txt" : You must have heard the name of this text file, if you are a web developer or you have got a little knowledge of search engine optimization. And if you don't know about this, one of the most powerful text files, then read on...

Robots.txt is a regular text file, as its extension suggests, which directs search engine robots and crawlers as they crawl your web pages.
It contains a few lines in a special format which are, in effect, a set of rules for the web crawlers.

The most basic robots.txt file has the following code within it:

User-agent: *
Disallow:

This file allows all crawlers to crawl all of your pages: the empty Disallow value means nothing is blocked (Allow: / is a common, non-standard way of saying the same thing).

There are many other ways to write a robots.txt file.
You can allow some bots and disallow others, as in the example below.
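For instance, in this illustrative file (BadBot is a made-up name), Googlebot may crawl everything, a crawler identifying itself as BadBot is shut out completely, and every other robot is kept away from /private/:

User-agent: Googlebot
Disallow:

User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /private/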

The Format of Robots.txt

The file consists of one or more records separated by one or more blank lines. Each record starts with one or more User-agent lines, followed by one or more Disallow lines, as detailed below. Unrecognized headers are ignored.

User-agent

The value of this field is the name of the robot for which the record describes the access policy. If more than one User-agent field is present, the record describes an identical access policy for more than one robot. At least one field needs to be present per record.

If the value is '*', the record describes the default access policy for any robot that has not matched any of the other records. It is not allowed to have multiple such records in the "/robots.txt" file.

Disallow

The value of this field specifies a partial URL that is not to be visited. This can be a full path or a partial path; any URL that starts with this value will not be retrieved.
For example, Disallow: /help disallows both /help.html and /help/index.html, whereas Disallow: /help/ would disallow /help/index.html but allow /help.html.
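If you want to sanity-check rules like these, Python's standard urllib.robotparser module applies this same matching logic. A minimal sketch, feeding it the example rules directly instead of fetching them from a site:

from urllib import robotparser

# The /help/ example from above, as a list of lines
rules = ["User-agent: *", "Disallow: /help/"]

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "/help.html"))        # True: only /help/ is disallowed
print(rp.can_fetch("*", "/help/index.html"))  # False: matches Disallow: /help/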

An empty value indicates that all URLs can be retrieved.

So if you want your web pages to be crawled and indexed by the search engines in the way you like, go and create a robots.txt file for your website.