Friday, June 27, 2008

Reseller web hosting

Reseller hosting is a form of web hosting in which the account owner can use his or her allotted hard drive space and bandwidth to host websites on behalf of third parties. The reseller purchases the host's services wholesale and then sells them to customers at a profit; a certain portion of disk space and bandwidth is allocated to the reseller's account. To achieve this, the reseller may rent a dedicated server from a hosting company or resell shared hosting services. In the latter case, the reseller is simply given permission to sell a certain amount of disk space and bandwidth to his or her own customers without renting a server from the web hosting company with which the reseller account was opened.
The typical web hosting reseller might be a web design firm, web developer or systems integrator who offers web hosting as an add-on service. Reseller hosting is also an inexpensive way for web hosting entrepreneurs to start a company. Most reseller hosting plans allow resellers to create their own service plans and choose their own pricing structure. In many cases, resellers are able to establish their own branding via customized control panels and name servers.
Reseller hosting does not require extensive knowledge of the technical aspects of web hosting. Usually, the data center operator is responsible for maintaining network infrastructure and hardware, and the dedicated server owner configures/secures/updates the server. A reseller is responsible for interfacing with his/her own customer base, but any hardware, software and connectivity problems are typically forwarded to the server provider from whom the reseller plan was purchased.
Through point-and-click control panels, resellers can set up and manage customer accounts via a web interface. In addition, billing software such as ModernBill is popular among resellers, as it automates account creation and invoicing. Most reseller hosting companies offer several different reseller plans, and a control panel is practically a requirement for running a web hosting business.
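The core bookkeeping behind a reseller plan, carving a wholesale disk and bandwidth allotment into customer accounts, can be sketched in a few lines. The following Python sketch is illustrative only; all class names, quotas, and account names are hypothetical, not any control panel's real API:

```python
# Toy model of a reseller plan: customer accounts are created out of
# the reseller's wholesale disk/bandwidth allotment, and an account
# that would oversell the allotment is refused.

class ResellerPlan:
    def __init__(self, disk_mb, bandwidth_mb):
        self.disk_mb = disk_mb            # wholesale disk allotment
        self.bandwidth_mb = bandwidth_mb  # wholesale monthly bandwidth
        self.accounts = {}                # name -> (disk_mb, bandwidth_mb)

    def create_account(self, name, disk_mb, bandwidth_mb):
        used_disk = sum(d for d, _ in self.accounts.values())
        used_bw = sum(b for _, b in self.accounts.values())
        if used_disk + disk_mb > self.disk_mb or used_bw + bandwidth_mb > self.bandwidth_mb:
            raise ValueError("allotment exceeded")
        self.accounts[name] = (disk_mb, bandwidth_mb)

plan = ResellerPlan(disk_mb=10_000, bandwidth_mb=100_000)
plan.create_account("customer-a", disk_mb=2_000, bandwidth_mb=20_000)
plan.create_account("customer-b", disk_mb=3_000, bandwidth_mb=30_000)
```

Real control panels track many more resources (email accounts, databases, domains), but the admission check above is the essential idea.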

One-click hosting

One-click hosting generally describes web services that allow Internet users to easily upload one or more files from their hard drives onto the one-click host's server free of charge.
Most such services simply return a URL which can be given to other people, who can then fetch the file later. Since 2005 these sites have increased drastically in popularity, and many of the smaller, less efficient sites have subsequently failed. Many Internet forums exist for sharing such links; this type of file sharing has, to a degree, displaced P2P file-sharing services.[1]
The sites make money through advertising or charging for premium services such as increased downloading capacity, removing any wait restrictions the site may have or prolonging how long uploaded files remain on the site. Many sites implement a CAPTCHA to prevent automated downloading.
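The upload-then-URL workflow described above can be sketched as follows. The storage scheme, token length, and the files.example.com URL are illustrative assumptions, not any real host's API:

```python
# Sketch of a one-click host's core loop: store the uploaded bytes
# under an opaque content-derived token and hand back a URL
# containing that token.

import hashlib

STORE = {}  # in a real host this would be disk- or object-storage backed

def upload(filename, data):
    key = hashlib.sha256(data).hexdigest()[:12]  # short opaque token
    STORE[key] = (filename, data)
    return f"https://files.example.com/get/{key}"  # placeholder domain

def download(url):
    key = url.rsplit("/", 1)[-1]
    return STORE[key]  # (filename, data)

url = upload("notes.txt", b"hello world")
name, data = download(url)
```

Deriving the token from a content hash also lets the host deduplicate identical uploads, which real one-click hosts commonly do.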

Dedicated hosting service

A dedicated hosting service, dedicated server, or managed hosting service is a type of Internet hosting in which the client leases an entire server not shared with anyone else. This is more flexible than shared hosting, as organizations have full control over the server(s), including choice of operating system, hardware, etc. Server administration can usually be provided by the hosting company as an add-on service. In some cases a dedicated server can offer less overhead and a larger return on investment. Dedicated servers are most often housed in data centers, similar to colocation facilities, providing redundant power sources and HVAC systems. In contrast to colocation, the server hardware is owned by the provider, and in some cases the provider will also support the client's operating system or applications.

Self-hosting

Self-hosting refers to the use of a computer program as part of the toolchain or operating system that produces new versions of that same program—for example, a compiler that can compile its own source code. Self-hosting software is commonplace on personal computers and larger systems. Other programs that are typically self-hosting include kernels, assemblers, and shells.
If a system is so new that no software has been written for it, then software is developed on another self-hosting system and placed on a storage device that the new system can read. Development continues this way until the new system can reliably host its own development. Development of the Linux operating system, for example, was initially hosted on a Minix system. Writing new software development tools "from the metal" (that is, without using another host system) is rare and in many cases impossible.
Several programming languages are self-hosting, in the sense that a compiler for the language, written in the same language, is available. The first compiler for a new programming language can be written in another language (in rare cases, machine language) or produced using bootstrapping. Self-hosting languages include Lisp, Forth, Pascal, Delphi, C, Modula-2, Oberon, Smalltalk, OCaml, and FreeBASIC.
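The bootstrapping process can be illustrated with a toy model. Here the "compiler" is just an identity transform written in Python, but the stage structure, with each stage compiling the compiler's own source until the output reaches a fixed point, mirrors how real self-hosting compilers are bootstrapped. All names are hypothetical:

```python
# Toy bootstrap: an existing host (here, the Python interpreter)
# runs stage1; stage1 "compiles" the compiler's own source into
# stage2; stage2 does the same to produce stage3. If the compiler
# is deterministic, stage2 and stage3 are identical.

COMPILER_SRC = 'def compile(src):\n    return src  # trivial "compiler": identity\n'

def run_stage(stage_src, input_src):
    """Execute a stage's source and apply its compile() to input_src."""
    env = {}
    exec(stage_src, env)
    return env["compile"](input_src)

stage1 = COMPILER_SRC                     # bootstrapped by the host
stage2 = run_stage(stage1, COMPILER_SRC)  # stage1 compiles its own source
stage3 = run_stage(stage2, COMPILER_SRC)  # stage2 compiles its own source
assert stage2 == stage3                   # fixed point: bootstrap converged
```

Real compilers such as GCC perform exactly this multi-stage build and compare the last two stages bit-for-bit as a sanity check.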

History
The first self-hosting compiler (excluding assemblers) was written for Lisp by Hart and Levin at MIT in 1962. Because Lisp interpreters existed previously, but no Lisp compilers, they used an original method to compile their compiler. The compiler, like any other Lisp program, could be run in a Lisp interpreter. So they simply ran the compiler in the interpreter, giving it its own source code to compile.[1]
The compiler as it exists on the standard compiler tape is a machine language program that was obtained by having the S-expression definition of the compiler work on itself through the interpreter. (AI Memo 39)[1]
This technique is only possible when an interpreter already exists for the very same language that is to be compiled. It borrows directly from the notion of running a program on itself as input, which is also used in various proofs in theoretical computer science, such as the proof that the halting problem is undecidable.

Free web hosting service

A free web hosting service is a web hosting service that is free of charge, usually advertisement-supported and of limited functionality, though not always. Free web hosts will usually provide a subdomain (yoursite.example.com) or a directory (www.example.com/~yourname). In contrast, paid web hosts will usually provide a second-level domain along with the hosting (www.yourname.com). Many free hosts do allow use of separately purchased domains. Rarely, a free host may also operate as a domain name registrar.

Features and limitations
Only a few free web hosts offer a basic package for free and enhanced packages (with more features) for a fee. This allows users to try the service for an initial trial period (and see how it performs compared to other hosts), then upgrade when and if needed.
Free hosting may have the following limitations:
Limitation on the size of each hosted file
Very small monthly bandwidth allowance compared to paid hosting
Hotlinking of files disabled
File type restrictions (for example, MP3, MPEG, or ZIP files)
Compulsory placement of the web host's banner or pop-up ads on all web pages
No uptime guarantee
No custom URLs, such as "http://www.domain.com"; the address must be "http://www.provider.com/domain" or "http://domain.provider.com/".
Some free hosts may provide these extra features:
A web-based control panel
Free email accounts for the hosted domain or subdomain
File transfer via FTP
Scripting languages such as PHP, ASP, and Perl
Relational databases such as MySQL
Scheduled processes, known as cron jobs
Other features such as guestbooks
Forums and community resources not typical of paid hosts
Reward systems which provide extra free products and services
No data limits (unlimited space)
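Of the features above, scheduled processes are purely configuration-driven: a cron job is declared as a single crontab line giving the schedule and the command. A hypothetical entry running a PHP cleanup script nightly at 02:30 (the script path is a placeholder) would look like:

```
# minute hour day-of-month month day-of-week  command
30 2 * * *  /usr/bin/php /home/user/scripts/cleanup.php
```

Free hosts that support cron jobs typically expose this through the web control panel rather than a raw crontab file.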

Data center

A data center is a facility used to house computer systems and associated components, such as telecommunications and storage systems. It generally includes redundant or backup power supplies, redundant data communications connections, environmental controls (e.g., air conditioning, fire suppression), and special security devices.

History
Data centers have their roots in the huge computer rooms of the early ages of the computing industry. Early computer systems were complex to operate and maintain, and required a special environment in which to operate. Many cables were necessary to connect all the components, and methods to accommodate and organize these were devised, such as standard racks to mount equipment, elevated floors, and cable trays (installed overhead or under the elevated floor). Also, old computers required a great deal of power, and had to be cooled to avoid overheating. Security was important – computers were expensive, and were often used for military purposes. Basic design guidelines for controlling access to the computer room were therefore devised.
During the boom of the microcomputer industry, and especially during the 1980s, computers started to be deployed everywhere, in many cases with little or no care about operating requirements. However, as information technology (IT) operations started to grow in complexity, companies grew aware of the need to control IT resources. With the advent of client-server computing during the 1990s, microcomputers (now called "servers") started to find their place in the old computer rooms. The availability of inexpensive networking equipment, coupled with new standards for network cabling, made it possible to use a hierarchical design that put the servers in a specific room inside the company. The use of the term "data center", as applied to specially designed computer rooms, started to gain popular recognition about this time.
The boom of data centers came during the dot-com bubble. Companies needed fast Internet connectivity and non-stop operation to deploy systems and establish a presence on the Internet. Installing such equipment was not viable for many smaller companies, so many providers started building very large facilities, called "Internet data centers" (IDCs), which provided businesses with a range of solutions for systems deployment and operation. New technologies and practices were designed to handle the scale and operational requirements of such large operations. These practices eventually migrated to private data centers, and were largely adopted because of their practical results.
As of 2007, data center design, construction, and operation is a well-known discipline. Standard documents from accredited professional groups, such as the Telecommunications Industry Association, specify the requirements for data center design. Well-known operational metrics for data center availability can be used to evaluate the business impact of a disruption. There is still a lot of development being done in operation practice, and also in environmentally-friendly data center design.

DNS hosting service

A DNS hosting service is a service that runs Domain Name System servers. Most, but not all, domain name registrars include DNS hosting service with registration. Free DNS hosting services also exist. Almost all DNS hosting services are "shared"; except for the most popular Internet sites, there is no need to dedicate a server to hosting DNS for a single website. Many third-party DNS hosting services provide Dynamic DNS.
A DNS hosting service works best when the provider has multiple servers in various geographic locations, which minimizes latency for clients around the world.
DNS can also be self-hosted by running DNS software on a generic Internet hosting service.
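As a concrete illustration of what self-hosted DNS involves, a minimal BIND-style zone file for an imaginary domain might look like the following. The domain, host names, and addresses (drawn from the RFC 5737 documentation ranges) are all placeholders:

```
$TTL 3600
@    IN SOA ns1.example.com. admin.example.com. (
         2008062701 ; serial
         7200       ; refresh
         900        ; retry
         1209600    ; expire
         3600 )     ; negative-caching TTL
@    IN NS  ns1.example.com.
@    IN NS  ns2.example.com.
ns1  IN A   192.0.2.10
ns2  IN A   198.51.100.10
www  IN A   192.0.2.20
```

A managed DNS hosting service maintains records like these on the customer's behalf, usually behind a web interface, instead of requiring hand-edited zone files.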

Free DNS
A number of sites offer free DNS hosting, either for second level domains registered with registrars which do not offer free (or sufficiently flexible) DNS service, or as third level domains (selection.somedomain.com). These services generally also offer Dynamic DNS. In many cases the free services can be upgraded with various premium services.