
1. PURPOSE

The purposes of this experiment are to show the importance of standardization of the solutions used in environmental chemical work, to demonstrate the use and calibration of burettes, and to illustrate the effect of personal errors in the use of standard solutions.

2. PROCEDURE

1. We dilute about 1.1 mL of concentrated H2SO4 to 1 liter with distilled water. (This gives approximately 0.04 N H2SO4; a rough check of this figure is sketched after the Theory section.)
2. We weigh about 0.12 g (to the nearest 0.1 mg) of HgO and add 25 mL of distilled water to it.
3. While warming and stirring, we add solid KI, a little at a time, until the oxide has completely dissolved.
4. We dilute the mixture accurately to 100 mL.
5. We fill a calibrated burette with the H2SO4 solution.
6. We titrate the standard solution with the acid in the presence of methyl red indicator (the colour changes from yellow to red).

3. THEORY

A standard solution (or standard titrant) is a reagent of known concentration that is used to carry out a volumetric analysis. Standard solutions play a central role in all volumetric methods of analysis. The ideal standard solution for a volumetric method will
1. be sufficiently stable that its concentration needs to be determined only once;
2. react rapidly with the analyte, so that the time required between additions of titrant is minimized;
3. react more or less completely with the analyte, so that satisfactory end points are realized;
4. undergo a selective reaction with the analyte that can be described by a simple balanced equation.

A titration is performed by slowly adding a standard solution from a buret or other volumetric measuring device to a solution of the analyte until the reaction between the two is complete. The volume needed to complete the titration is determined from the difference between the initial and final buret readings. The equivalence point in a titration is reached when the amount of added titrant is chemically equivalent to the amount of analyte in the sample. The equivalence point is a theoretical point that cannot be determined experimentally; instead, we can only estimate it by observing some physical change associated with the condition of equivalence. This change is called the end point of the titration. The difference in volume between the equivalence point and the end point is the titration error.
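As a quick check on procedure step 1, the sketch below estimates the normality of the concentrated sulfuric acid from its density and assay, and then the normality after dilution to 1 L. The density (1.84 g/mL), the 98% assay and the resulting value of about 36.8 N appear in the calculations section of this report; the molar mass of H2SO4 (98.08 g/mol) and the use of two equivalents per mole are assumptions of this sketch, not values taken from the report.

```python
# Rough check of procedure step 1: diluting ~1.1 mL of concentrated
# H2SO4 to 1 L should give roughly 0.04 N acid.
DENSITY = 1.84        # g/mL, concentrated H2SO4 (from the report)
ASSAY = 0.98          # mass fraction of H2SO4 (98%, from the report)
MOLAR_MASS = 98.08    # g/mol (assumed)
EQUIV_PER_MOL = 2     # acidic protons per H2SO4 molecule (assumed)

equiv_weight = MOLAR_MASS / EQUIV_PER_MOL                 # g per equivalent
normality_conc = 1000 * DENSITY * ASSAY / equiv_weight    # eq/L of concentrated acid
normality_diluted = normality_conc * 1.1 / 1000.0         # 1.1 mL diluted to 1000 mL

print(f"Concentrated acid: {normality_conc:.1f} N")       # ~36.8 N
print(f"After dilution:    {normality_diluted:.3f} N")    # ~0.040 N
```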

With the exception of absolute methods of analysis that involve chemical reactions of known stoichiometry (e.g. gravimetric and titrimetric determinations), a calibration or standardization procedure is required to establish the relation between a measured physicochemical response and the analyte producing that response. In other words, standardization is the process of accurately determining the concentration of a standard solution. A primary standard is a highly purified compound that serves as a reference material in volumetric titrimetric methods. Important requirements for a primary standard are:
1. High purity; established methods for confirming purity should be available.
2. Stability in air.
3. Absence of water of hydration, so that the composition does not change with variations in relative humidity.
4. Ready availability at modest cost.
5. Reasonable solubility in the titration medium.
6. Reasonably large formula weight, so that the relative error associated with weighing is minimized.

HgO is a primary standard for H2SO4. A weighed sample is dissolved in a solution of KI according to the reaction

HgO + 4I⁻ + H2O → HgI4²⁻ + 2OH⁻

The resulting alkaline solution is titrated with the acid. Methyl red (or any other indicator changing within the pH range 4.5-9.5) may be used as the indicator.

4. DATA ANALYSIS AND CALCULATIONS

1. The exact concentration of the acid solution:
Weight of HgO = 0.114 g
H2SO4 used in titration = 30.8 mL
Normality = accurate weight of HgO (g) / [meq. weight of HgO × volume of H2SO4 (mL)]
Normality = 0.114 g / (0.108305 × 30.8 mL)
Normality of H2SO4 = 0.034 N
where
Milliequivalent weight of HgO = molecular weight of HgO / (2 × 1000)
Milliequivalent weight of HgO = 216.61 / 2000 = 0.108305

2. Concentrated sulfuric acid (98%): density = 1.84 g/mL, molarity = 18.4 M, normality = 36.8 N. For 1.1 mL diluted to 1000 mL of solution:
Normality = 1.1 mL × 36.8 / 1000 mL = 0.040 N
(from www.erowid.org/archive/rhodium/chemistry/equipment/molarity.html)
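The following is a minimal sketch of the standardization arithmetic in item 1 above, using the HgO mass and titrant volume reported there; the variable names are illustrative and not part of the report.

```python
# Standardization of H2SO4 against HgO (section 4, item 1):
# N_acid = m_HgO / (meq_HgO * V_acid), with meq_HgO = M_HgO / (2 * 1000).
M_HGO = 216.61   # g/mol, molecular weight of HgO (from the report)
m_hgo = 0.114    # g of HgO weighed out
v_acid = 30.8    # mL of H2SO4 used in the titration

meq_hgo = M_HGO / 2 / 1000             # g per milliequivalent = 0.108305
n_acid = m_hgo / (meq_hgo * v_acid)    # normality of the acid

print(f"Normality of H2SO4 = {n_acid:.3f} N")   # ~0.034 N
```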

Error 1 (standardized value from the HgO titration vs. the nominal 0.04 N): (0.04 − 0.034) / 0.04 × 100 = 15% error
Error 2 (value calculated from the dilution vs. the nominal 0.04 N): (0.04 − 0.040) / 0.04 × 100 = 0% error
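Both figures are simple relative deviations from the nominal 0.04 N; a short check of the arithmetic:

```python
# Relative error of each result against the nominal 0.04 N acid concentration.
nominal = 0.040
for label, measured in [("Error 1 (titration result)", 0.034),
                        ("Error 2 (dilution estimate)", 0.040)]:
    print(label, f"{(nominal - measured) / nominal * 100:.0f}%")   # 15%, 0%
```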

5. DISCUSSION AND CONCLUSIONS

1. Standard solutions are also commonly used to determine the concentration of an analyte species. By comparing the absorbance of the sample solution at a specific wavelength with that of a series of standard solutions of differing known analyte concentrations, the concentration of the sample solution can be found via Beer's law. Any form of spectroscopy can be used in this way as long as the analyte species absorbs appreciably in the spectral region used. The standard solution thus serves as a reference for finding the molarity of an unknown species.
2. A solution of acid can be standardized by titrating it against a solution of an alkali (e.g. NaOH) of known concentration. Once its concentration has been determined, the acid can in turn be used as a standard solution to find the concentration of a solution of alkali.
3. Possible sources of the 15% error in our result: we may have used more than 25 mL of distilled water, weighed out less than 0.12 g of HgO, or failed to stir the mixture adequately.

6. REFERENCES

1. Skoog, D.A., West, D.M., Fundamentals of Analytical Chemistry, 6th edition, Holt, Rinehart and Winston, Inc., 2004, pp. 94-97.
2. Radojevic, M., Bashkin, V.N., Practical Environmental Analysis, The Royal Society of Chemistry, Cambridge, 1999, p. 21.
3. Kealey, D., Haines, P.J., Instant Notes: Analytical Chemistry, Bios Scientific Publishers Limited, Oxford, 2002, p. 15.
4. Freiser, H., Nancollas, G., Compendium of Analytical Nomenclature: Definitive Rules, Blackwell Scientific Publications, Oxford, 1987, p. 48.
