3D Graphics: A WebGL Tutorial

Whether you just want to create an interactive 3D logo on the screen or design a fully fledged game, knowing the principles of 3D graphics rendering will help you achieve your goal.

In this article, Toptal Freelance Software Engineer Adnan Ademovic gives us a step-by-step tutorial to rendering objects with textures and lighting, by breaking down abstract concepts like objects, lights, and cameras into simple WebGL procedures.

The world of 3D graphics can be very intimidating to get into. Whether you just want to create an interactive 3D logo, or design a fully fledged game, if you don't know the principles of 3D rendering, you're stuck using a library that abstracts out a lot of things.
Using a library can be just the right tool, and JavaScript has an amazing open source one in the form of three.js. There are some disadvantages to using pre-made solutions, though:

- They can have many features that you don't plan to use. The size of the minified base three.js features is around 500kB, and any extra features (loading actual model files is one of them) make the payload even larger. Transferring that much data just to show a spinning logo on your website would be a waste.

- An extra layer of abstraction can make otherwise easy modifications hard to do. Your creative way of shading an object on the screen can either be straightforward to implement or require tens of hours of work to incorporate into the library's abstractions.

- While the library is optimized very well in most scenarios, a lot of bells and whistles can be cut out for your use case. The renderer can cause certain procedures to run millions of times on the graphics card. Every instruction removed from such a procedure means that a weaker graphics card can handle your content without problems.

Even if you decide to use a high-level graphics library, having basic knowledge of the things under the hood allows you to use it more effectively. Libraries can also have advanced features, like ShaderMaterial in three.js. Knowing the principles of graphics rendering allows you to use such features.

Our goal is to give a short introduction to all the key concepts behind rendering 3D graphics, and to using WebGL to implement them. You will see the most common thing that is done, which is showing and moving 3D objects in an empty space.

The final code is available for you to fork and play around with.
Representing 3D Models

The first thing you need to understand is how 3D models are represented. A model is made of a mesh of triangles. Each triangle is represented by three vertices, one for each corner of the triangle. Three properties are most commonly attached to vertices.
Vertex Position

Position is the most intuitive property of a vertex. It is the position in 3D space, represented by a 3D vector of coordinates. If you know the exact coordinates of three points in space, you have all the information you need to draw a simple triangle between them. To make models look actually good when rendered, there are a couple more things that need to be provided to the renderer.
Vertex Normal

Consider the two models above. They consist of the same vertex positions, yet look totally different when rendered. How is that possible?

Besides telling the renderer where we want a vertex to be located, we can also give it a hint on how the surface is slanted in that exact position. The hint is in the form of the normal of the surface at that specific point on the model, represented with a 3D vector. The following image should give you a more descriptive look at how that is handled.

The left and right surfaces correspond to the left and right ball in the previous image, respectively. The red arrows represent normals that are specified for a vertex, while the blue arrows represent the renderer's calculations of how the normal should look for all the points between the vertices. The image shows a demonstration for 2D space, but the same principle applies in 3D.

The normal is a hint for how lights will illuminate the surface. The closer a light ray's direction is to the normal, the brighter the point is. Having gradual changes in the normal direction causes light gradients, while having abrupt changes with no changes in-between causes surfaces with constant illumination across them, and sudden changes in illumination between them.
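That relationship can be made concrete with the standard diffuse (Lambertian) shading term: the brightness of a point is the dot product of the unit surface normal and the unit vector pointing toward the light, clamped at zero. This is a minimal sketch, not code from this tutorial; the helper names are ours:

```javascript
// Dot product of two 3D vectors given as {x, y, z} objects
function dot (a, b) {
  return a.x * b.x + a.y * b.y + a.z * b.z
}

// Lambertian diffuse term: both vectors are assumed to be normalized.
// The result is 1 for light hitting the surface head-on, falling to 0
// as the light direction becomes perpendicular to the normal.
function diffuseBrightness (normal, toLight) {
  return Math.max(0, dot(normal, toLight))
}

var headOn = diffuseBrightness({ x: 0, y: 1, z: 0 }, { x: 0, y: 1, z: 0 })
var grazing = diffuseBrightness({ x: 0, y: 1, z: 0 }, { x: 1, y: 0, z: 0 })
```

With this rule, smoothly varying normals between vertices produce smooth brightness gradients, while constant normals produce flat shading, which is exactly the difference between the two balls above.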
Texture Coordinates

The last significant property is texture coordinates, commonly referred to as UV mapping. You have a model, and a texture that you want to apply to it. The texture has various areas on it, representing images that we want to apply to different parts of the model. There has to be a way to mark which triangle should be represented with which part of the texture. That's where texture mapping comes in.

For each vertex, we mark two coordinates, U and V. These coordinates represent a position on the texture, with U representing the horizontal axis, and V the vertical axis. The values aren't in pixels, but a percentage position within the image. The bottom-left corner of the image is represented with two zeros, while the top-right is represented with two ones.

A triangle is just painted by taking the UV coordinates of each vertex in the triangle, and applying the image that is captured between those coordinates on the texture.

You can see a demonstration of UV mapping on the image above. The spherical model was cut into parts that are small enough to be flattened onto a 2D surface. The seams where the cuts were made are marked with thicker lines. One of the patches has been highlighted, so you can nicely see how things match. You can also see how a seam through the middle of the smile places parts of the mouth into two different patches.

The wireframes aren't part of the texture, but are just overlaid over the image so you can see how things map together.
Loading an OBJ Model

Believe it or not, this is all you need to know to create your own simple model loader. The OBJ file format is simple enough to implement a parser in a few lines of code.

The file lists vertex positions on lines beginning with v, followed by three floats, with an optional fourth float, which we will ignore to keep things simple. Vertex normals are represented similarly, on lines beginning with vn. Finally, texture coordinates are represented on lines beginning with vt, with an optional third float, which we shall also ignore. In all three cases, the floats represent the respective coordinates. These three properties are accumulated in three arrays.

Faces are represented with groups of vertices. Each vertex is represented with the index of each of the properties, whereby indices start at 1. There are various ways this is represented, but we will stick to the f v1/vt1/vn1 v2/vt2/vn2 v3/vt3/vn3 format, requiring all three properties to be provided, and limiting the number of vertices per face to three. All of these limitations are being made to keep the loader as simple as possible, since all other options require some extra, trivial processing before they are in a format that WebGL likes.

We've put in a lot of requirements for our file loader. That may sound limiting, but 3D modeling applications tend to give you the ability to set those limitations when exporting a model as an OBJ file.
The following code parses a string representing an OBJ file, and creates a model in the form of an array of faces.
function Geometry (faces) {
  this.faces = faces || []
}

// Parses an OBJ file, passed as a string
Geometry.parseOBJ = function (src) {
  var POSITION = /^v\s+([\d\.\+\-eE]+)\s+([\d\.\+\-eE]+)\s+([\d\.\+\-eE]+)/
  var NORMAL = /^vn\s+([\d\.\+\-eE]+)\s+([\d\.\+\-eE]+)\s+([\d\.\+\-eE]+)/
  var UV = /^vt\s+([\d\.\+\-eE]+)\s+([\d\.\+\-eE]+)/
  var FACE = /^f\s+(-?\d+)\/(-?\d+)\/(-?\d+)\s+(-?\d+)\/(-?\d+)\/(-?\d+)\s+(-?\d+)\/(-?\d+)\/(-?\d+)(?:\s+(-?\d+)\/(-?\d+)\/(-?\d+))?/

  var lines = src.split('\n')
  var positions = []
  var uvs = []
  var normals = []
  var faces = []

  lines.forEach(function (line) {
    // Match each line of the file against various RegEx-es
    var result
    if ((result = POSITION.exec(line)) != null) {
      // Add new vertex position
      positions.push(new Vector3(parseFloat(result[1]), parseFloat(result[2]), parseFloat(result[3])))
    } else if ((result = NORMAL.exec(line)) != null) {
      // Add new vertex normal
      normals.push(new Vector3(parseFloat(result[1]), parseFloat(result[2]), parseFloat(result[3])))
    } else if ((result = UV.exec(line)) != null) {
      // Add new texture mapping point, flipping V to match the texture orientation used later
      uvs.push(new Vector2(parseFloat(result[1]), 1 - parseFloat(result[2])))
    } else if ((result = FACE.exec(line)) != null) {
      // Add new face
      var vertices = []
      // Create three vertices from the passed one-indexed indices
      for (var i = 1; i < 10; i += 3) {
        var part = result.slice(i, i + 3)
        var position = positions[parseInt(part[0]) - 1]
        var uv = uvs[parseInt(part[1]) - 1]
        var normal = normals[parseInt(part[2]) - 1]
        vertices.push(new Vertex(position, normal, uv))
      }
      faces.push(new Face(vertices))
    }
  })

  return new Geometry(faces)
}

// Loads an OBJ file from the given URL, and returns it as a promise
Geometry.loadOBJ = function (url) {
  return new Promise(function (resolve) {
    var xhr = new XMLHttpRequest()
    xhr.onreadystatechange = function () {
      if (xhr.readyState == XMLHttpRequest.DONE) {
        resolve(Geometry.parseOBJ(xhr.responseText))
      }
    }
    xhr.open('GET', url, true)
    xhr.send(null)
  })
}

function Face (vertices) {
  this.vertices = vertices || []
}

function Vertex (position, normal, uv) {
  this.position = position || new Vector3()
  this.normal = normal || new Vector3()
  this.uv = uv || new Vector2()
}

function Vector3 (x, y, z) {
  this.x = Number(x) || 0
  this.y = Number(y) || 0
  this.z = Number(z) || 0
}

function Vector2 (x, y) {
  this.x = Number(x) || 0
  this.y = Number(y) || 0
}
The Geometry structure holds the exact data needed to send a model to the graphics card to process. Before you do that, though, you'd probably want the ability to move the model around on the screen.
Performing Spatial Transformations

All the points in the model we loaded are relative to its coordinate system. If we want to translate, rotate, and scale the model, all we need to do is perform that operation on its coordinate system. Coordinate system A, relative to coordinate system B, is defined by the position of its center as a vector p_ab, and the vector for each of its axes, x_ab, y_ab, and z_ab, representing the direction of that axis. So if a point moves by 10 on the x axis of coordinate system A, then, in coordinate system B, it will move in the direction of x_ab, multiplied by 10.

All of this information is stored in the following matrix form:
x_ab.x  y_ab.x  z_ab.x  p_ab.x
x_ab.y  y_ab.y  z_ab.y  p_ab.y
x_ab.z  y_ab.z  z_ab.z  p_ab.z
0       0       0       1
If we want to transform the 3D vector q, we just have to multiply the transformation matrix with the vector:
q.x
q.y
q.z
1
This causes the point to move by q.x along the new x axis, by q.y along the new y axis, and by q.z along the new z axis. Finally, it causes the point to move additionally by the p vector, which is the reason why we use a one as the final element of the multiplication.

The big advantage of using these matrices is the fact that if we have multiple transformations to perform on the vertex, we can merge them into one transformation by multiplying their matrices, prior to transforming the vertex itself.
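Here is a minimal sketch of what that multiplication does to a point, with the matrix stored column-major the way the implementation further down stores it; the helper name is ours:

```javascript
// Applies a 4x4 column-major matrix m to the point q = (x, y, z, 1).
// m[col * 4 + row] holds the element at the given row and column,
// so indices 12-14 hold the translation vector p.
function transformPoint (m, q) {
  return {
    x: m[0] * q.x + m[4] * q.y + m[8] * q.z + m[12],
    y: m[1] * q.x + m[5] * q.y + m[9] * q.z + m[13],
    z: m[2] * q.x + m[6] * q.y + m[10] * q.z + m[14]
  }
}

// A pure translation by p = (5, 0, 0): the axis vectors keep their
// default values, and the point is simply offset by p.
var translation = [
  1, 0, 0, 0,
  0, 1, 0, 0,
  0, 0, 1, 0,
  5, 0, 0, 1
]
var moved = transformPoint(translation, { x: 1, y: 2, z: 3 })
```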
There are various transformations that can be performed, and we'll take a look at the key ones.

No Transformation

If no transformations happen, then the p vector is a zero vector, the x vector is [1, 0, 0], y is [0, 1, 0], and z is [0, 0, 1]. From now on we'll refer to these values as the default values for these vectors. Applying these values gives us an identity matrix:
1 0 0 0
0 1 0 0
0 0 1 0
0 0 0 1
This is a good starting point for chaining transformations.

Translation

When we perform a translation, all the vectors except for the p vector keep their default values. This results in the following matrix:
1 0 0 p.x
0 1 0 p.y
0 0 1 p.z
0 0 0 1
Scaling

Scaling a model means changing the amount that each coordinate contributes to the position of a point. There is no uniform offset caused by scaling, so the p vector keeps its default value. The default axis vectors should be multiplied by their respective scaling factors, which results in the following matrix:
s_x 0   0   0
0   s_y 0   0
0   0   s_z 0
0   0   0   1

Here s_x, s_y, and s_z represent the scaling applied to each axis.
Rotation

The image above shows what happens when we rotate the coordinate frame around the Z axis.

Rotation results in no uniform offset, so the p vector keeps its default value. Now things get a bit trickier. Rotations cause movement along a certain axis in the original coordinate system to move in a different direction. So if we rotate a coordinate system by 45 degrees around the Z axis, moving along the x axis of the original coordinate system causes movement in a diagonal direction between the x and y axes in the new coordinate system.

To keep things simple, we'll just show you how the transformation matrices look for rotations around the main axes.
Around X:

1 0        0         0
0 cos(phi) -sin(phi) 0
0 sin(phi) cos(phi)  0
0 0        0         1
Around Y:

cos(phi)  0 sin(phi) 0
0         1 0        0
-sin(phi) 0 cos(phi) 0
0         0 0        1
Around Z:

cos(phi) -sin(phi) 0 0
sin(phi) cos(phi)  0 0
0        0         1 0
0        0         0 1
Implementation

All of this can be implemented as a class that stores 16 numbers, keeping the matrices in column-major order.
function Transformation () {
  // Create an identity transformation
  this.fields = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1]
}

// Multiply matrices, to chain transformations
Transformation.prototype.mult = function (t) {
  var output = new Transformation()
  for (var row = 0; row < 4; ++row) {
    for (var col = 0; col < 4; ++col) {
      var sum = 0
      for (var k = 0; k < 4; ++k) {
        sum += this.fields[k * 4 + row] * t.fields[col * 4 + k]
      }
      output.fields[col * 4 + row] = sum
    }
  }
  return output
}

// Multiply by translation matrix
Transformation.prototype.translate = function (x, y, z) {
  var mat = new Transformation()
  mat.fields[12] = Number(x) || 0
  mat.fields[13] = Number(y) || 0
  mat.fields[14] = Number(z) || 0
  return this.mult(mat)
}

// Multiply by scaling matrix
Transformation.prototype.scale = function (x, y, z) {
  var mat = new Transformation()
  mat.fields[0] = Number(x) || 0
  mat.fields[5] = Number(y) || 0
  mat.fields[10] = Number(z) || 0
  return this.mult(mat)
}

// Multiply by rotation matrix around X axis
Transformation.prototype.rotateX = function (angle) {
  angle = Number(angle) || 0
  var c = Math.cos(angle)
  var s = Math.sin(angle)
  var mat = new Transformation()
  mat.fields[5] = c
  mat.fields[10] = c
  mat.fields[9] = -s
  mat.fields[6] = s
  return this.mult(mat)
}

// Multiply by rotation matrix around Y axis
Transformation.prototype.rotateY = function (angle) {
  angle = Number(angle) || 0
  var c = Math.cos(angle)
  var s = Math.sin(angle)
  var mat = new Transformation()
  mat.fields[0] = c
  mat.fields[10] = c
  mat.fields[2] = -s
  mat.fields[8] = s
  return this.mult(mat)
}

// Multiply by rotation matrix around Z axis
Transformation.prototype.rotateZ = function (angle) {
  angle = Number(angle) || 0
  var c = Math.cos(angle)
  var s = Math.sin(angle)
  var mat = new Transformation()
  mat.fields[0] = c
  mat.fields[5] = c
  mat.fields[4] = -s
  mat.fields[1] = s
  return this.mult(mat)
}
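As a quick sanity check of the field layout (re-implemented inline in a condensed form so the snippet runs on its own): rotating 90 degrees around the Z axis should carry a point on the x axis onto the y axis.

```javascript
// Build the rotateZ matrix exactly as Transformation.prototype.rotateZ
// fills its column-major fields array
function rotateZMatrix (angle) {
  var c = Math.cos(angle)
  var s = Math.sin(angle)
  var fields = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1]
  fields[0] = c
  fields[5] = c
  fields[4] = -s
  fields[1] = s
  return fields
}

// Apply a column-major matrix to the point (x, y, z, 1)
function apply (m, x, y, z) {
  return {
    x: m[0] * x + m[4] * y + m[8] * z + m[12],
    y: m[1] * x + m[5] * y + m[9] * z + m[13],
    z: m[2] * x + m[6] * y + m[10] * z + m[14]
  }
}

// (1, 0, 0) rotated 90 degrees around Z lands on (0, 1, 0),
// up to floating point error
var p = apply(rotateZMatrix(Math.PI / 2), 1, 0, 0)
```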
Looking through a Camera

Here comes the key part of presenting objects on the screen: the camera. There are two key components to a camera; namely, its position, and how it projects observed objects onto the screen.

Camera position is handled with one simple trick. There is no visual difference between moving the camera a meter forward and moving the whole world a meter backward. So, naturally, we do the latter, by applying the inverse of the camera's position matrix as a transformation.

The second key component is the way observed objects are projected onto the lens. In WebGL, everything visible on the screen is located in a box that spans between -1 and 1 on each axis. We can use the same approach of transformation matrices to create a projection matrix.
Orthographic Projection

The simplest projection is orthographic projection. You take a box in space, denoting its width, height, and depth, with the assumption that its center is at the zero position. Then the projection resizes the box to fit it into the previously described box within which WebGL observes objects. Since we want to resize each dimension to two, we scale each axis by 2/size, whereby size is the dimension of the respective axis. One small caveat is that we multiply the Z axis by a negative number, because we want to flip the direction of that axis. The final matrix has this form:
2/width 0        0        0
0       2/height 0        0
0       0        -2/depth 0
0       0        0        1
Perspective Projection

We won't go through the details of how this projection is designed, but will just use the final formula, which is pretty much standard by now. We can simplify it by placing the projection at the zero position on the x and y axes, making the right/left and top/bottom limits equal to width/2 and height/2, respectively. The parameters n and f represent the near and far clipping planes, which are the smallest and largest distances at which a point can be captured by the camera. They are represented by the parallel sides of the frustum in the above image.

A perspective projection is usually represented with a field of view (we'll use the vertical one), an aspect ratio, and the near and far plane distances. That information can be used to calculate width and height, and then the matrix can be created from the following template:
2*n/width 0          0           0
0         2*n/height 0           0
0         0          (f+n)/(n-f) 2*f*n/(n-f)
0         0          -1          0
To calculate the width and height, the following formulas can be used:

height = 2 * near * Math.tan(fov * Math.PI / 360)
width = aspectRatio * height

The FOV (field of view) represents the vertical angle that the camera captures with its lens. The aspect ratio represents the ratio between image width and height, and is based on the dimensions of the screen we're rendering to.
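As a worked example of these formulas: a 90-degree vertical FOV with the near plane at distance 1 gives a view plane exactly 2 units tall, since tan(45°) = 1, and a 16:9 aspect ratio then makes it 32/9, roughly 3.56, units wide.

```javascript
var fov = 90           // vertical field of view, in degrees
var near = 1           // near clipping plane distance
var aspectRatio = 16 / 9

// The same formulas as above
var height = 2 * near * Math.tan(fov * Math.PI / 360)
var width = aspectRatio * height
```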
Implementation

Now we can represent a camera as a class that stores the camera position and projection matrix. We also need to know how to calculate inverse transformations. Solving general matrix inversions can be problematic, but there is a simplified approach for our special case.
function Camera () {
  this.position = new Transformation()
  this.projection = new Transformation()
}

Camera.prototype.setOrthographic = function (width, height, depth) {
  this.projection = new Transformation()
  this.projection.fields[0] = 2 / width
  this.projection.fields[5] = 2 / height
  this.projection.fields[10] = -2 / depth
}

Camera.prototype.setPerspective = function (verticalFov, aspectRatio, near, far) {
  var height_div_2n = Math.tan(verticalFov * Math.PI / 360)
  var width_div_2n = aspectRatio * height_div_2n
  this.projection = new Transformation()
  this.projection.fields[0] = 1 / width_div_2n
  this.projection.fields[5] = 1 / height_div_2n
  this.projection.fields[10] = (far + near) / (near - far)
  this.projection.fields[11] = -1
  this.projection.fields[14] = 2 * far * near / (near - far)
  this.projection.fields[15] = 0
}

Camera.prototype.getInversePosition = function () {
  var orig = this.position.fields
  var dest = new Transformation()
  var x = orig[12]
  var y = orig[13]
  var z = orig[14]

  // Transpose the rotation matrix; for a pure rotation R, the
  // transpose R^T is equal to the inverse R^-1
  for (var i = 0; i < 3; ++i) {
    for (var j = 0; j < 3; ++j) {
      dest.fields[i * 4 + j] = orig[i + j * 4]
    }
  }

  // Combining R^T with a translation by -p undoes the original
  // camera transformation
  return dest.translate(-x, -y, -z)
}
This is the final piece we need before we can start drawing things on the screen.

Drawing an Object with the WebGL Graphics Pipeline

The simplest surface you can draw is a triangle. In fact, the majority of things that you draw in 3D space consist of a great number of triangles.

The first thing that you need to understand is how the screen is represented in WebGL. It is a 3D space, spanning between -1 and 1 on the x, y, and z axes. By default, the z axis is not used, but since you are interested in 3D graphics, you'll want to enable it right away.

Having that in mind, what follows are the three steps required to draw a triangle onto this surface.

1. You define three vertices, which represent the triangle you want to draw. You serialize that data and send it over to the GPU (graphics processing unit). With a whole model available, you can do that for all the triangles in the model. The vertex positions you give are in the local coordinate space of the model you've loaded. Put simply, the positions you provide are the exact ones from the file, and not the ones you get after performing matrix transformations.

2. Now that you've given the vertices to the GPU, you tell the GPU what logic to use when placing the vertices onto the screen. This step is where our matrix transformations are applied. The GPU is very good at multiplying a lot of 4x4 matrices, so we'll put that ability to good use.

3. In the last step, the GPU rasterizes the triangle. Rasterization is the process of taking vector graphics and determining which pixels of the screen need to be painted for that vector graphics object to be displayed. In our case, the GPU is trying to determine which pixels are located within each triangle. For each pixel, the GPU will ask you what color you want it to be painted.

These are the three steps needed to draw anything you want, and they are the simplest example of a graphics pipeline. What follows is a look at each of them, and a simple implementation.
The Default Framebuffer

The most important element for a WebGL application is the WebGL context. You can access it with gl = canvas.getContext('webgl'), or use 'experimental-webgl' as a fallback, in case the currently used browser doesn't support all WebGL features yet. The canvas we referred to is the DOM element of the canvas we want to draw on. The context contains many things, among which is the default framebuffer.

You could loosely describe a framebuffer as any buffer (object) that you can draw on. By default, the default framebuffer stores the color for each pixel of the canvas that the WebGL context is bound to. As described in the previous section, when we draw on the framebuffer, each pixel is located between -1 and 1 on the x and y axes. Something we also mentioned is the fact that, by default, WebGL doesn't use the z axis. That functionality can be enabled by running gl.enable(gl.DEPTH_TEST). Great, but what is a depth test?

Enabling the depth test allows a pixel to store both color and depth. The depth is the z coordinate of that pixel. After you draw to a pixel at a certain depth z, to update the color of that pixel, you need to draw at a z position that is closer to the camera. Otherwise, the draw attempt will be ignored. This allows for the illusion of 3D, since drawing objects that are behind other objects will cause those objects to be occluded by the objects in front of them.
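The rule can be sketched in plain JavaScript. This is an illustration of the idea rather than WebGL's actual mechanism, and in this sketch a smaller z counts as closer to the camera:

```javascript
// A draw only lands on a pixel if its z is closer than the stored depth
function drawPixel (pixel, color, z) {
  if (z < pixel.depth) {
    pixel.color = color
    pixel.depth = z
  }
  return pixel
}

// The cleared pixel starts with the farthest possible depth
var pixel = { color: 'background', depth: Infinity }
drawPixel(pixel, 'far object', 0.8)    // lands, 0.8 < Infinity
drawPixel(pixel, 'near object', 0.2)   // lands, 0.2 < 0.8
drawPixel(pixel, 'hidden object', 0.5) // ignored, 0.5 > 0.2
```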
Any draws you perform stay on the screen until you tell them to get cleared. To do so, you have to call gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT). This clears both the color and the depth buffer. To pick the color that the cleared pixels are set to, use gl.clearColor(red, green, blue, alpha).

Let's create a renderer that uses a canvas and clears it upon request:
function Renderer (canvas) {
  var gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl')
  gl.enable(gl.DEPTH_TEST)
  this.gl = gl
}

Renderer.prototype.setClearColor = function (red, green, blue) {
  this.gl.clearColor(red / 255, green / 255, blue / 255, 1)
}

Renderer.prototype.getContext = function () {
  return this.gl
}

Renderer.prototype.render = function () {
  this.gl.clear(this.gl.COLOR_BUFFER_BIT | this.gl.DEPTH_BUFFER_BIT)
}

var renderer = new Renderer(document.getElementById('webgl-canvas'))
renderer.setClearColor(100, 149, 237)

loop()

function loop () {
  renderer.render()
  requestAnimationFrame(loop)
}
Attaching this script to a page containing a canvas element with the id webgl-canvas will give you a bright blue rectangle on the screen.

The requestAnimationFrame call causes the loop to be called again as soon as the previous frame is done rendering and all event handling is finished.
Vertex Buffer Objects

The first thing you need to do is define the vertices that you want to draw. You can do that by describing them via vectors in 3D space. After that, you want to move that data into the GPU RAM, by creating a new Vertex Buffer Object (VBO).

A Buffer Object in general is an object that stores an array of memory chunks on the GPU. Its being a VBO just denotes what the GPU can use the memory for. Most of the time, Buffer Objects you create will be VBOs.

You fill the VBO by taking all N vertices that we have and creating an array of floats, with 3N elements for the vertex position and vertex normal VBOs, and 2N for the texture coordinates VBO. Each group of three floats, or two floats for UV coordinates, represents the individual coordinates of a vertex. Then we pass these arrays to the GPU, and our vertices are ready for the rest of the pipeline.

Since the data is now in the GPU RAM, you can delete it from the general purpose RAM. That is, unless you want to modify it later and upload it again. Each modification needs to be followed by an upload, since modifications to our JS arrays don't apply to the VBOs in the actual GPU RAM.

Below is a code example that provides all of the described functionality. An important note to make is the fact that variables stored on the GPU are not garbage collected. That means that we have to manually delete them once we don't want to use them any more. We will just give you an example of how that is done here, and will not focus on that concept further on. Deleting variables from the GPU is necessary only if you plan to stop using certain geometry throughout the program.

We also added serialization to our Geometry class and the elements within it.
Geometry.prototype.vertexCount = function () {
  return this.faces.length * 3
}

Geometry.prototype.positions = function () {
  var answer = []
  this.faces.forEach(function (face) {
    face.vertices.forEach(function (vertex) {
      var v = vertex.position
      answer.push(v.x, v.y, v.z)
    })
  })
  return answer
}

Geometry.prototype.normals = function () {
  var answer = []
  this.faces.forEach(function (face) {
    face.vertices.forEach(function (vertex) {
      var v = vertex.normal
      answer.push(v.x, v.y, v.z)
    })
  })
  return answer
}

Geometry.prototype.uvs = function () {
  var answer = []
  this.faces.forEach(function (face) {
    face.vertices.forEach(function (vertex) {
      var v = vertex.uv
      answer.push(v.x, v.y)
    })
  })
  return answer
}
function VBO (gl, data, count) {
  // Creates buffer object in GPU RAM where we can store anything
  var bufferObject = gl.createBuffer()
  // Tell which buffer object we want to operate on as a VBO
  gl.bindBuffer(gl.ARRAY_BUFFER, bufferObject)
  // Write the data, and set the flag to optimize
  // for rare changes to the data we're writing
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(data), gl.STATIC_DRAW)

  this.gl = gl
  this.size = data.length / count
  this.count = count
  this.data = bufferObject
}

VBO.prototype.destroy = function () {
  // Free memory that is occupied by our buffer object
  this.gl.deleteBuffer(this.data)
}
The VBO data type generates the VBO in the passed WebGL context, based on the array passed as a second parameter.

You can see three calls to the gl context. The createBuffer() call creates the buffer. The bindBuffer() call tells the WebGL state machine to use this specific memory as the current VBO (ARRAY_BUFFER) for all future operations, until told otherwise. After that, we set the value of the current VBO to the provided data, with bufferData().

We also provide a destroy method that deletes our buffer object from the GPU RAM, by using deleteBuffer().

You can use three VBOs and a transformation to describe all the properties of a mesh, together with its position.
function Mesh (gl, geometry) {
  var vertexCount = geometry.vertexCount()
  this.positions = new VBO(gl, geometry.positions(), vertexCount)
  this.normals = new VBO(gl, geometry.normals(), vertexCount)
  this.uvs = new VBO(gl, geometry.uvs(), vertexCount)
  this.vertexCount = vertexCount
  this.position = new Transformation()
  this.gl = gl
}

Mesh.prototype.destroy = function () {
  this.positions.destroy()
  this.normals.destroy()
  this.uvs.destroy()
}
As an example, here is how we can load a model, store its properties in the mesh, and then destroy it:
Geometry.loadOBJ('/assets/model.obj').then(function (geometry) {
  // Using the gl context from the renderer created earlier
  var gl = renderer.getContext()
  var mesh = new Mesh(gl, geometry)
  console.log(mesh)
  mesh.destroy()
})
Shaders

What follows is the previously described two-step process of moving points into desired positions and painting all individual pixels. To do this, we write a program that is run on the graphics card many times. This program typically consists of at least two parts. The first part is a vertex shader, which is run for each vertex, and outputs where we should place the vertex on the screen, among other things. The second part is a fragment shader, which is run for each pixel that a triangle covers on the screen, and outputs the color that pixel should be painted.

Vertex Shaders

Let's say you want to have a model that moves around left and right on the screen. In a naive approach, you could update the position of each vertex and resend it to the GPU. That process is expensive and slow. Alternatively, you can give the GPU a program to run for each vertex, and do all those operations in parallel with a processor that is built for doing exactly that job. That is the role of a vertex shader.

A vertex shader is the part of the rendering pipeline that processes individual vertices. A call to the vertex shader receives a single vertex and outputs a single vertex after all possible transformations to the vertex are applied.

Shaders are written in GLSL. There are a lot of unique elements to this language, but most of the syntax is very C-like, so it should be understandable to most people.

There are three types of variables that go in and out of a vertex shader, and all of them serve a specific use:

- attribute: These are inputs that hold specific properties of a vertex. Previously, we described the position of a vertex as an attribute, in the form of a three-element vector. You can look at attributes as values that describe one vertex.

- uniform: These are inputs that are the same for every vertex within the same rendering call. Let's say that we want to be able to move our model around, by defining a transformation matrix. You can use a uniform variable to describe that. Uniforms can point to resources on the GPU as well, like textures. You can look at uniforms as values that describe a model, or a part of a model.

- varying: These are outputs that we pass to the fragment shader. Since there are potentially thousands of pixels for a triangle of vertices, each pixel will receive an interpolated value for this variable, depending on its position. So if one vertex sends 500 as an output, and another one 100, a pixel that is in the middle between them will receive 300 as an input for that variable. You can look at varyings as values that describe the surfaces between vertices.
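The interpolation described for varyings is, to a first approximation, a linear blend between the two vertex outputs. In plain JavaScript terms:

```javascript
// Linear interpolation: t = 0 gives a, t = 1 gives b
function interpolate (a, b, t) {
  return a * (1 - t) + b * t
}

// One vertex outputs 500, another outputs 100; a pixel halfway
// between them receives 300
var middle = interpolate(500, 100, 0.5)
```

Real rasterizers actually interpolate in a perspective-correct manner rather than purely linearly, but the linear picture matches the intuition above.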
So,let’ssayyouwanttocreateavertexshaderthatreceivesaposition,normal,anduvcoordinatesforeachvertex,andaposition,view(inversecameraposition),andprojectionmatrixforeachrenderedobject.Let’ssayyoualsowanttopaintindividualpixelsbasedontheiruvcoordinatesandtheirnormals.“Howwouldthatcodelook?”youmightask.
attribute vec3 position;
attribute vec3 normal;
attribute vec2 uv;
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
varying vec3 vNormal;
varying vec2 vUv;

void main() {
    vUv = uv;
    vNormal = (model * vec4(normal, 0.)).xyz;
    gl_Position = projection * view * model * vec4(position, 1.);
}
Most of the elements here should be self-explanatory. The key thing to notice is the fact that there are no return values in the main function. All values that we would want to return are assigned, either to varying variables, or to special variables. Here we assign to gl_Position, which is a four-dimensional vector, whereby the last dimension should always be set to one. Another strange thing you might notice is the way we construct a vec4 out of the position vector. You can construct a vec4 by using four floats, two vec2s, or any other combination that results in four elements. There are a lot of seemingly strange type casts which make perfect sense once you're familiar with transformation matrices.

You can also see that here we can perform matrix transformations extremely easily. GLSL is specifically made for this kind of work. The output position is calculated by multiplying the projection, view, and model matrices and applying the result to the position. The output normal is just transformed to world space. We'll explain later why we've stopped there with the normal transformations.

For now, we will keep it simple, and move on to painting individual pixels.
Fragment Shaders

A fragment shader is the step after rasterization in the graphics pipeline. It generates color, depth, and other data for every pixel of the object that is being painted.

The principles behind implementing fragment shaders are very similar to vertex shaders. There are three major differences, though:

- There are no more varying outputs, and attribute inputs have been replaced with varying inputs. We have just moved on in our pipeline, and things that are the output in the vertex shader are now inputs in the fragment shader.

- Our only output now is gl_FragColor, which is a vec4. The elements represent red, green, blue, and alpha (RGBA), respectively, with values in the 0 to 1 range. You should keep alpha at 1, unless you're doing transparency. Transparency is a fairly advanced concept though, so we'll stick to opaque objects.

- At the beginning of the fragment shader, you need to set the float precision, which is important for interpolations. In almost all cases, just stick to the lines from the following shader.

With that in mind, you can easily write a shader that paints the red channel based on the U position, the green channel based on the V position, and sets the blue channel to maximum.
#ifdef GL_ES
precision highp float;
#endif

varying vec3 vNormal;
varying vec2 vUv;

void main() {
    vec2 clampedUv = clamp(vUv, 0., 1.);
    gl_FragColor = vec4(clampedUv, 1., 1.);
}
The function clamp just limits all floats in an object to be within the given limits. The rest of the code should be pretty straightforward.
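If clamp is unfamiliar, its behavior is easy to reproduce in JavaScript; GLSL simply applies it component-wise when it is given a vector like vUv. The function names here are ours:

```javascript
// Limit a single value to the [min, max] range
function clamp (value, min, max) {
  return Math.min(Math.max(value, min), max)
}

// GLSL applies clamp to each component of a vector independently
function clampVec2 (v, min, max) {
  return { x: clamp(v.x, min, max), y: clamp(v.y, min, max) }
}

var inside = clamp(0.25, 0, 1)  // unchanged
var above = clamp(1.5, 0, 1)    // limited to 1
var uv = clampVec2({ x: -0.5, y: 0.75 }, 0, 1)
```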
With all of this in mind, all that is left is to implement this in WebGL.
Combining Shaders into a Program

The next step is to combine the shaders into a program:
function ShaderProgram (gl, vertSrc, fragSrc) {
  var vert = gl.createShader(gl.VERTEX_SHADER)
  gl.shaderSource(vert, vertSrc)
  gl.compileShader(vert)
  if (!gl.getShaderParameter(vert, gl.COMPILE_STATUS)) {
    console.error(gl.getShaderInfoLog(vert))
    throw new Error('Failed to compile shader')
  }

  var frag = gl.createShader(gl.FRAGMENT_SHADER)
  gl.shaderSource(frag, fragSrc)
  gl.compileShader(frag)
  if (!gl.getShaderParameter(frag, gl.COMPILE_STATUS)) {
    console.error(gl.getShaderInfoLog(frag))
    throw new Error('Failed to compile shader')
  }

  var program = gl.createProgram()
  gl.attachShader(program, vert)
  gl.attachShader(program, frag)
  gl.linkProgram(program)
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    console.error(gl.getProgramInfoLog(program))
    throw new Error('Failed to link program')
  }

  this.gl = gl
  this.position = gl.getAttribLocation(program, 'position')
  this.normal = gl.getAttribLocation(program, 'normal')
  this.uv = gl.getAttribLocation(program, 'uv')
  this.model = gl.getUniformLocation(program, 'model')
  this.view = gl.getUniformLocation(program, 'view')
  this.projection = gl.getUniformLocation(program, 'projection')
  this.vert = vert
  this.frag = frag
  this.program = program
}

// Loads shader files from the given URLs, and returns a program as a promise
ShaderProgram.load = function (gl, vertUrl, fragUrl) {
  return Promise.all([loadFile(vertUrl), loadFile(fragUrl)]).then(function (files) {
    return new ShaderProgram(gl, files[0], files[1])
  })

  function loadFile (url) {
    return new Promise(function (resolve) {
      var xhr = new XMLHttpRequest()
      xhr.onreadystatechange = function () {
        if (xhr.readyState == XMLHttpRequest.DONE) {
          resolve(xhr.responseText)
        }
      }
      xhr.open('GET', url, true)
      xhr.send(null)
    })
  }
}
Thereisn’tmuchtosayaboutwhat’shappeninghere.Eachshadergetsassignedastringasasourceandcompiled,afterwhichwechecktoseeiftherewerecompilationerrors.Then,wecreateaprogrambylinkingthesetwoshaders.Finally,westorepointerstoallrelevantattributesanduniformsforposterity.
Actually Drawing the Model

Last, but not least, you draw the model.

First you pick the shader program you want to use.
ShaderProgram.prototype.use = function () {
  this.gl.useProgram(this.program)
}
Then you send all the camera-related uniforms to the GPU. These uniforms change only once per camera change or movement.
Transformation.prototype.sendToGpu = function (gl, uniform, transpose) {
  gl.uniformMatrix4fv(uniform, transpose || false, new Float32Array(this.fields))
}

Camera.prototype.use = function (shaderProgram) {
  this.projection.sendToGpu(shaderProgram.gl, shaderProgram.projection)
  this.getInversePosition().sendToGpu(shaderProgram.gl, shaderProgram.view)
}
Finally, you take the transformations and VBOs and assign them to uniforms and attributes, respectively. Since this has to be done for each VBO, you can create its data binding as a method.
VBO.prototype.bindToAttribute = function (attribute) {
  var gl = this.gl
  // Tell which buffer object we want to operate on as a VBO
  gl.bindBuffer(gl.ARRAY_BUFFER, this.data)
  // Enable this attribute in the shader
  gl.enableVertexAttribArray(attribute)
  // Define format of the attribute array. Must match parameters in shader
  gl.vertexAttribPointer(attribute, this.size, gl.FLOAT, false, 0, 0)
}
Then you assign an array of three floats to the uniform. Each uniform type has a different signature, so the documentation is your friend here. Finally, you draw the triangle array on the screen. You tell the drawing call drawArrays() from which vertex to start, and how many vertices to draw. The first parameter tells WebGL how it shall interpret the array of vertices. Using TRIANGLES takes vertices three at a time and draws a triangle for each triplet. Using POINTS would just draw a point for each passed vertex. There are many more options, but there is no need to discover everything at once. Below is the code for drawing an object:
Mesh.prototype.draw = function (shaderProgram) {
  this.positions.bindToAttribute(shaderProgram.position)
  this.normals.bindToAttribute(shaderProgram.normal)
  this.uvs.bindToAttribute(shaderProgram.uv)
  this.position.sendToGpu(this.gl, shaderProgram.model)
  this.gl.drawArrays(this.gl.TRIANGLES, 0, this.vertexCount)
}
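To make the drawArrays() modes concrete, here is a small standalone helper (hypothetical, not part of the tutorial's classes) that counts how many primitives a draw call would emit for a given mode and vertex count:

```javascript
// Hypothetical helper illustrating how drawArrays() consumes vertices:
// TRIANGLES groups vertices three by three, POINTS draws one point per vertex.
function primitiveCount (mode, vertexCount) {
  if (mode === 'TRIANGLES') {
    return Math.floor(vertexCount / 3) // one triangle per vertex triplet
  }
  if (mode === 'POINTS') {
    return vertexCount // one point per passed vertex
  }
  throw new Error('Unhandled mode: ' + mode)
}

// A cube split into 12 triangles needs 36 vertices in the buffer:
console.log(primitiveCount('TRIANGLES', 36)) // 12
console.log(primitiveCount('POINTS', 36)) // 36
```

The real gl.TRIANGLES and gl.POINTS are integer constants on the rendering context; strings are used here only to keep the sketch self-contained.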
The renderer needs to be extended a bit to accommodate all the extra elements that need to be handled. It should be possible to attach a shader program, and to render an array of objects based on the current camera position.
Renderer.prototype.setShader = function (shader) {
  this.shader = shader
}

Renderer.prototype.render = function (camera, objects) {
  var gl = this.gl
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT)
  var shader = this.shader
  if (!shader) {
    return
  }
  shader.use()
  camera.use(shader)
  objects.forEach(function (mesh) {
    mesh.draw(shader)
  })
}
We can combine all the elements that we have to finally draw something on the screen:
var renderer = new Renderer(document.getElementById('webgl-canvas'))
renderer.setClearColor(100, 149, 237)
var gl = renderer.getContext()

var objects = []

Geometry.loadOBJ('/assets/sphere.obj').then(function (data) {
  objects.push(new Mesh(gl, data))
})
ShaderProgram.load(gl, '/shaders/basic.vert', '/shaders/basic.frag')
  .then(function (shader) {
    renderer.setShader(shader)
  })

var camera = new Camera()
camera.setOrthographic(16, 10, 10)

loop()

function loop () {
  renderer.render(camera, objects)
  requestAnimationFrame(loop)
}
This looks a bit random, but you can see the different patches of the sphere, based on where they are on the UV map. You can change the shader to paint the object brown. Just set the color for each pixel to be the RGBA for brown:
#ifdef GL_ES
precision highp float;
#endif

varying vec3 vNormal;
varying vec2 vUv;

void main() {
    vec3 brown = vec3(.54, .27, .07);
    gl_FragColor = vec4(brown, 1.);
}
Itdoesn’tlookveryconvincing.Itlookslikethesceneneedssomeshadingeffects.
Adding Light
Lights and shadows are the tools that allow us to perceive the shape of objects. Lights come in many shapes and sizes: spotlights that shine in one cone, light bulbs that spread light in all directions, and most interestingly, the sun, which is so far away that all the light it shines on us radiates, for all intents and purposes, in the same direction.
Sunlight sounds like it's the simplest to implement, since all you need to provide is the direction in which all rays spread. For each pixel that you draw on the screen, you check the angle under which the light hits the object. This is where the surface normals come in.
You can see all the light rays flowing in the same direction, and hitting the surface under different angles, which are based on the angle between the light ray and the surface normal. The more they coincide, the stronger the light is.
If you perform a dot product between the normalized vectors for the light ray and the surface normal, you will get -1 if the ray hits the surface perfectly perpendicularly, 0 if the ray is parallel to the surface, and 1 if it illuminates it from the opposite side. So anything between 0 and 1 should add no light, while numbers between 0 and -1 should gradually increase the amount of light hitting the object. You can test this by adding a fixed light in the shader code.
#ifdef GL_ES
precision highp float;
#endif

varying vec3 vNormal;
varying vec2 vUv;

void main() {
    vec3 brown = vec3(.54, .27, .07);
    vec3 sunlightDirection = vec3(-1., -1., -1.);
    float lightness = -clamp(dot(normalize(vNormal), normalize(sunlightDirection)), -1., 0.);
    gl_FragColor = vec4(brown * lightness, 1.);
}
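The lightness formula can also be sanity-checked outside the shader. Below is a plain JavaScript sketch with hand-rolled stand-ins for the GLSL built-ins normalize, dot, and clamp (these helpers are illustrative, not part of the tutorial's code):

```javascript
// CPU-side stand-ins for the GLSL built-ins used in the shader
function normalize (v) {
  var len = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2])
  return [v[0] / len, v[1] / len, v[2] / len]
}
function dot (a, b) {
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
}
function clamp (x, min, max) {
  return Math.min(Math.max(x, min), max)
}

// Mirrors: lightness = -clamp(dot(normalize(vNormal), normalize(sunlightDirection)), -1., 0.)
function lightness (normal, sunlightDirection) {
  return -clamp(dot(normalize(normal), normalize(sunlightDirection)), -1, 0)
}

var sun = [-1, -1, -1]
console.log(lightness([1, 1, 1], sun)) // fully lit (1): the surface faces the sun head-on
console.log(lightness([-1, -1, -1], sun)) // unlit (0): the surface faces away from the sun
```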
We set the sun to shine in the forward-left-down direction. You can see how smooth the shading is, even though the model is very jagged. You can also notice how dark the bottom-left side is. We can add a level of ambient light, which will make the area in the shadow brighter.
#ifdef GL_ES
precision highp float;
#endif

varying vec3 vNormal;
varying vec2 vUv;

void main() {
    vec3 brown = vec3(.54, .27, .07);
    vec3 sunlightDirection = vec3(-1., -1., -1.);
    float lightness = -clamp(dot(normalize(vNormal), normalize(sunlightDirection)), -1., 0.);
    float ambientLight = 0.3;
    lightness = ambientLight + (1. - ambientLight) * lightness;
    gl_FragColor = vec4(brown * lightness, 1.);
}
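The remapping in the shader, lightness = ambientLight + (1. - ambientLight) * lightness, simply rescales the [0, 1] lightness range into [ambientLight, 1], so even fully shadowed areas receive the ambient amount. A quick numeric sketch of that formula:

```javascript
// Remap a [0, 1] lightness into [ambient, 1], as the fragment shader does
function applyAmbient (lightness, ambientLight) {
  return ambientLight + (1 - ambientLight) * lightness
}

console.log(applyAmbient(0, 0.3)) // full shadow still receives the ambient level (0.3)
console.log(applyAmbient(1, 0.3)) // fully lit areas stay fully lit (~1)
```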
You can achieve this same effect by introducing a light class, which stores the light direction and ambient light intensity. Then you can change the fragment shader to accommodate that addition.
Now the shader becomes:
#ifdef GL_ES
precision highp float;
#endif

uniform vec3 lightDirection;
uniform float ambientLight;

varying vec3 vNormal;
varying vec2 vUv;

void main() {
    vec3 brown = vec3(.54, .27, .07);
    float lightness = -clamp(dot(normalize(vNormal), normalize(lightDirection)), -1., 0.);
    lightness = ambientLight + (1. - ambientLight) * lightness;
    gl_FragColor = vec4(brown * lightness, 1.);
}
Then you can define the light:
function Light () {
  this.lightDirection = new Vector3(-1, -1, -1)
  this.ambientLight = 0.3
}

Light.prototype.use = function (shaderProgram) {
  var dir = this.lightDirection
  var gl = shaderProgram.gl
  gl.uniform3f(shaderProgram.lightDirection, dir.x, dir.y, dir.z)
  gl.uniform1f(shaderProgram.ambientLight, this.ambientLight)
}
In the shader program class, add the needed uniforms:

this.ambientLight = gl.getUniformLocation(program, 'ambientLight')
this.lightDirection = gl.getUniformLocation(program, 'lightDirection')
In the program, add a call to the new light in the renderer:
Renderer.prototype.render = function (camera, light, objects) {
  var gl = this.gl
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT)
  var shader = this.shader
  if (!shader) {
    return
  }
  shader.use()
  light.use(shader)
  camera.use(shader)
  objects.forEach(function (mesh) {
    mesh.draw(shader)
  })
}
The loop will then change slightly:
var light = new Light()

loop()

function loop () {
  renderer.render(camera, light, objects)
  requestAnimationFrame(loop)
}
Ifyou’vedoneeverythingright,thentherenderedimageshouldbethesameasitwasinthelastimage.
Afinalsteptoconsiderwouldbeaddinganactualtexturetoourmodel.Let’sdothatnow.
Adding Textures
HTML5 has great support for loading images, so there is no need to do crazy image parsing. Images are passed to GLSL as sampler2D by telling the shader which of the bound textures to sample. There is a limited number of textures one can bind, and the limit is based on the hardware used. A sampler2D can be queried for colors at certain positions. This is where UV coordinates come in. Here is an example where we replaced brown with sampled colors.
#ifdef GL_ES
precision highp float;
#endif

uniform vec3 lightDirection;
uniform float ambientLight;
uniform sampler2D diffuse;

varying vec3 vNormal;
varying vec2 vUv;

void main() {
    float lightness = -clamp(dot(normalize(vNormal), normalize(lightDirection)), -1., 0.);
    lightness = ambientLight + (1. - ambientLight) * lightness;
    gl_FragColor = vec4(texture2D(diffuse, vUv).rgb * lightness, 1.);
}
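For intuition, texture2D(diffuse, vUv) looks up a color at a UV coordinate in the [0, 1] range along each axis. Its simplest form, nearest-neighbor sampling, can be sketched in plain JavaScript (a hypothetical helper; a real sampler also applies the filtering and wrapping configured on the texture):

```javascript
// Nearest-neighbor lookup of an RGB texel at UV coordinates (illustrative sketch)
// image is { width, height, pixels }, with pixels as a flat [r, g, b, ...] array
function sampleNearest (image, u, v) {
  var x = Math.min(image.width - 1, Math.floor(u * image.width))
  var y = Math.min(image.height - 1, Math.floor(v * image.height))
  var offset = (y * image.width + x) * 3
  return image.pixels.slice(offset, offset + 3)
}

// A 2x1 texture: left texel red, right texel green
var tinyTexture = { width: 2, height: 1, pixels: [255, 0, 0, 0, 255, 0] }
console.log(sampleNearest(tinyTexture, 0.25, 0.5)) // [ 255, 0, 0 ]
console.log(sampleNearest(tinyTexture, 0.75, 0.5)) // [ 0, 255, 0 ]
```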
The new uniform has to be added to the listing in the shader program:

this.diffuse = gl.getUniformLocation(program, 'diffuse')
Finally,we’llimplementtextureloading.Aspreviouslysaid,HTML5providesfacilitiesforloadingimages.AllweneedtodoissendtheimagetotheGPU:
function Texture (gl, image) {
  var texture = gl.createTexture()
  // Set the newly created texture context as active texture
  gl.bindTexture(gl.TEXTURE_2D, texture)
  // Set texture parameters, and pass the image that the texture is based on
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image)
  // Set filtering methods
  // Very often shaders will query the texture value between pixels,
  // and this is instructing how that value shall be calculated
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR)
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR)
  this.data = texture
  this.gl = gl
}

Texture.prototype.use = function (uniform, binding) {
  binding = Number(binding) || 0
  var gl = this.gl
  // We can bind multiple textures, and here we pick which of the bindings
  // we're setting right now
  gl.activeTexture(gl['TEXTURE' + binding])
  // After picking the binding, we set the texture
  gl.bindTexture(gl.TEXTURE_2D, this.data)
  // Finally, we pass to the uniform the binding ID we've used
  gl.uniform1i(uniform, binding)
  // The previous 3 lines are equivalent to:
  // texture[i] = this.data
  // uniform = i
}

Texture.load = function (gl, url) {
  return new Promise(function (resolve) {
    var image = new Image()
    image.onload = function () {
      resolve(new Texture(gl, image))
    }
    image.src = url
  })
}
The process is not much different from the process used to load and bind VBOs. The main difference is that we're no longer binding to an attribute, but rather binding the index of the texture to an integer uniform. The sampler2D type is nothing more than a pointer offset to a texture.
Now all that needs to be done is extend the Mesh class to handle textures as well:
function Mesh (gl, geometry, texture) { // added texture
  var vertexCount = geometry.vertexCount()
  this.positions = new VBO(gl, geometry.positions(), vertexCount)
  this.normals = new VBO(gl, geometry.normals(), vertexCount)
  this.uvs = new VBO(gl, geometry.uvs(), vertexCount)
  this.texture = texture // new
  this.vertexCount = vertexCount
  this.position = new Transformation()
  this.gl = gl
}

Mesh.prototype.destroy = function () {
  this.positions.destroy()
  this.normals.destroy()
  this.uvs.destroy()
}

Mesh.prototype.draw = function (shaderProgram) {
  this.positions.bindToAttribute(shaderProgram.position)
  this.normals.bindToAttribute(shaderProgram.normal)
  this.uvs.bindToAttribute(shaderProgram.uv)
  this.position.sendToGpu(this.gl, shaderProgram.model)
  this.texture.use(shaderProgram.diffuse, 0) // new
  this.gl.drawArrays(this.gl.TRIANGLES, 0, this.vertexCount)
}

Mesh.load = function (gl, modelUrl, textureUrl) { // new
  var geometry = Geometry.loadOBJ(modelUrl)
  var texture = Texture.load(gl, textureUrl)
  return Promise.all([geometry, texture]).then(function (params) {
    return new Mesh(gl, params[0], params[1])
  })
}
And the final main script would look as follows:
var renderer = new Renderer(document.getElementById('webgl-canvas'))
renderer.setClearColor(100, 149, 237)
var gl = renderer.getContext()

var objects = []

Mesh.load(gl, '/assets/sphere.obj', '/assets/diffuse.png')
  .then(function (mesh) {
    objects.push(mesh)
  })
ShaderProgram.load(gl, '/shaders/basic.vert', '/shaders/basic.frag')
  .then(function (shader) {
    renderer.setShader(shader)
  })

var camera = new Camera()
camera.setOrthographic(16, 10, 10)
var light = new Light()

loop()

function loop () {
  renderer.render(camera, light, objects)
  requestAnimationFrame(loop)
}
Even animating comes easy at this point. If you want the camera to spin around our object, you can do it by adding just one line of code:
function loop () {
  renderer.render(camera, light, objects)
  camera.position = camera.position.rotateY(Math.PI / 120)
  requestAnimationFrame(loop)
}
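rotateY spins the camera's position vector around the world Y axis. As a reference, here is a standalone sketch of such a rotation on a plain { x, y, z } point (the tutorial's own Vector3 implementation was defined earlier in the article and may differ in details):

```javascript
// Rotate a point around the Y axis by an angle given in radians (sketch)
function rotateY (p, angle) {
  var c = Math.cos(angle)
  var s = Math.sin(angle)
  return {
    x: c * p.x + s * p.z,
    y: p.y, // height above the ground plane is unaffected
    z: -s * p.x + c * p.z
  }
}

// A quarter turn moves a point from the X axis onto the Z axis,
// keeping its distance from the axis of rotation unchanged
var p = rotateY({ x: 10, y: 0, z: 0 }, Math.PI / 2)
console.log(p.x, p.z) // ~0, ~-10
```

Calling this once per frame with a small angle, as the loop above does with Math.PI / 120, produces a full orbit every 240 frames.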
Feel free to play around with shaders. Adding one line of code will turn this realistic lighting into something cartoonish.
void main() {
    float lightness = -clamp(dot(normalize(vNormal), normalize(lightDirection)), -1., 0.);
    lightness = lightness > 0.1 ? 1. : 0.; // new
    lightness = ambientLight + (1. - ambientLight) * lightness;
    gl_FragColor = vec4(texture2D(diffuse, vUv).rgb * lightness, 1.);
}
It’sassimpleastellingthelightingtogointoitsextremesbasedonwhetheritcrossedasetthreshold.
Where to Go Next
There are many sources of information for learning all the tricks and intricacies of WebGL. And the best part is that if you can't find an answer that relates to WebGL, you can look for it in OpenGL, since WebGL is pretty much based on a subset of OpenGL, with some names being changed.
In no particular order, here are some great sources for more detailed information, both for WebGL and OpenGL:

- WebGL Fundamentals
- Learning WebGL
- A very detailed OpenGL tutorial, guiding you through all the fundamental principles described here in a very slow and detailed way
- MDN documentation for WebGL
- The Khronos WebGL 1.0 specification, if you're interested in the more technical details of how the WebGL API should work in all edge cases

And there are many, many other sites dedicated to teaching you the principles of computer graphics.