Search results “Spatial data mining wikipedia free”
How to easily use CART decision tree with GIS data in R environment?
 
23:21
To download the R code and GIS data, please visit: http://althuwaynee.blogspot.com.tr/2017/03/how-to-produce-classification-decision.html
Views: 1776 AlThuwaynee
What is VIDEO MOTION ANALYSIS? What does VIDEO MOTION ANALYSIS mean? VIDEO MOTION ANALYSIS meaning
 
04:41
What is VIDEO MOTION ANALYSIS? What does VIDEO MOTION ANALYSIS mean? Source: Wikipedia.org article, adapted under https://creativecommons.org/licenses/by-sa/3.0/ license. SUBSCRIBE to our Google Earth flights channel - https://www.youtube.com/channel/UC6UuCPh7GrXznZi0Hz2YQnQ Video motion analysis is a technique used to get information about moving objects from video. Examples of this include gait analysis, sport replays, speed and acceleration calculations and, in the case of team or individual sports, task performance analysis. The motion analysis technique usually involves a high-speed camera and a computer that has software allowing frame-by-frame playback of the video. Traditionally, video motion analysis has been used in scientific circles for calculation of speeds of projectiles, or in sport for improving the play of athletes. Recently, computer technology has allowed other applications of video motion analysis to surface, including teaching fundamental laws of physics to school students and general educational projects in sport and science. In sport, systems have been developed to provide a high level of task, performance and physiological data to coaches, teams and players. The objective is to improve individual and team performance and/or analyse opposition patterns of play to give tactical advantage. The repetitive and patterned nature of sports games lends itself to video analysis in that, over a period of time, real patterns, trends or habits can be discerned. Police and forensic scientists analyse CCTV video when investigating criminal activity. Police use software which performs video motion analysis to search for key events in video and find suspects. A digital video camera is mounted on a tripod. The moving object of interest is filmed doing a motion with a scale in clear view on the camera.
Using video motion analysis software, the image on screen can be calibrated to the size of the scale, enabling measurement of real-world values. The software also takes note of the time between frames to give a movement-versus-time data set. This is useful, for instance, in calculating gravitational acceleration from a dropping ball. Sophisticated sport analysis systems such as those by Verusco Technologies in New Zealand use other methods, such as direct feeds from satellite television, to provide real-time analysis to coaches over the Internet and more detailed post-game analysis after the game has ended. There are many commercial packages that enable frame-by-frame or real-time video motion analysis. There are also free packages available that provide the necessary software functions. These free packages include the relatively old but still functional Physvis, and a relatively new program called PhysMo which runs on Macintosh and Windows. Upmygame is a free online application. VideoStrobe is free software that creates a strobographic image from a video; motion analysis can then be carried out with dynamic geometry software such as GeoGebra. The objective for video motion analysis will determine the type of software used. Prozone and Amisco are expensive stadium-based camera installations focusing on player movement and patterns. Both of these provide a service to "tag" or "code" the video with the players' actions, and deliver the results after the match. In each of these services, the data is tagged according to the company's standards for defining actions. Verusco Technologies focus more on task and performance and can therefore analyse games from any ground. Focus X2 and Sportscode systems rely on the team performing the analysis in house, allowing the results to be available immediately, and to the team's own coding standards. MatchMatix takes the data output of video analysis software and analyses sequences of events.
Live HTML reports are generated and shared across a LAN, giving updates to the manager on the touchline while the game is in progress.
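The calibration and timing steps described above (a scale of known length in the frame, and the known time between frames) can be sketched in Python; the frame rate, scale size and pixel positions below are invented illustration values, not any real package's API:

```python
# Sketch of the video motion analysis workflow: a scale object of known
# length calibrates pixels to metres, and the frame rate gives the time
# step between measurements.
def calibrate(scale_pixels, scale_metres):
    """Metres per pixel, from a reference object of known size in frame."""
    return scale_metres / scale_pixels

def estimate_g(drop_metres, fps):
    """Least-squares fit of s = 0.5*g*t^2 to a dropping-ball trajectory:
    g = 2 * sum(s_i * t_i^2) / sum(t_i^4)."""
    times = [i / fps for i in range(len(drop_metres))]
    num = sum(s * t * t for s, t in zip(drop_metres, times))
    den = sum(t ** 4 for t in times)
    return 2 * num / den

# Invented measurements: a 1 m ruler spans 500 px; ball positions are
# tracked frame by frame at 240 fps (synthetic, noise-free data).
m_per_px = calibrate(500, 1.0)
fps = 240.0
pixels = [0.5 * 9.81 * (i / fps) ** 2 / m_per_px for i in range(10)]
g = estimate_g([p * m_per_px for p in pixels], fps)
```

With noise-free synthetic data the fit recovers the value of g used to generate it; real footage would add tracking error.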
Views: 170 The Audiopedia
What is Lucene | Exploring Apache Lucene in depth | Apache Lucene Tutorial
 
01:18:39
( Apache Solr Certification Training - https://www.edureka.co/apache-solr-self-paced ) Watch the sample class recording: http://www.edureka.co/apache-solr?utm_source=youtube&utm_medium=referral&utm_campaign=what-is-lucene Apache Lucene is a free, open-source information retrieval software library, originally written in Java by Doug Cutting. It is supported by the Apache Software Foundation and is released under the Apache Software License. Lucene scoring is the heart of why we all love Lucene. It is blazingly fast and it hides almost all of the complexity from the user. In a nutshell, it works. At least, that is, until it doesn't work, or doesn't work as one would expect it to work. Then we are left digging into Lucene internals or asking for help on [email protected] to figure out why a document with five of our query terms scores lower than a different document with only one of the query terms. Let's go through this video to explore Lucene in depth. Related posts: http://www.edureka.co/blog/apache-solr-shedding-some-light/?utm_source=youtube&utm_medium=referral&utm_campaign=what-is-lucene http://www.edureka.co/blog/solr30thoct?utm_source=youtube&utm_medium=referral&utm_campaign=what-is-lucene Edureka is a New Age e-learning platform that provides instructor-led live online classes for learners who would prefer a hassle-free and self-paced learning environment, accessible from any part of the world. The topics related to 'Apache Lucene' have been covered in our course 'Apache Solr'. For more information, please write back to us at [email protected] Call us at US: 1800 275 9730 (toll free) or India: +91-8880862004
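Lucene's classic similarity is TF-IDF based, with a coordination factor and various normalizations. The sketch below is a deliberately simplified toy (not Lucene's actual formula or API) that reproduces the surprise mentioned above: a document matching a single rare query term can outscore one matching four common terms.

```python
import math

# Toy TF-IDF ranker, loosely modelled on Lucene's classic similarity
# (real Lucene also applies length norms, boosts, etc.):
#   score(q, d) = coord(q, d) * sum over matched terms of tf * idf^2
def idf(term, docs):
    """Inverse document frequency: rare terms get large weights."""
    df = sum(1 for d in docs if term in d)
    return 1 + math.log(len(docs) / (1 + df))

def score(query, doc, docs):
    matched = [t for t in query if t in doc]
    coord = len(matched) / len(query)  # fraction of query terms matched
    return coord * sum(math.sqrt(doc.count(t)) * idf(t, docs) ** 2
                       for t in matched)

# Invented corpus: four very common terms and one rare term.
common = ["the", "quick", "brown", "fox"]
docs = [common] * 98 + [common + ["jumps"], ["zanzibar"]]
query = ["the", "quick", "brown", "fox", "zanzibar"]
```

Here the one-term document wins because idf enters the score squared, so a single rare match outweighs four matches whose idf is near zero.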
Views: 10632 edureka!
International Journal of Data Mining & Knowledge Management Process (IJDKP)
 
00:36
International Journal of Data Mining & Knowledge Management Process (IJDKP) http://airccse.org/journal/ijdkp/ijdkp.html ISSN: 2230-9608 [Online]; 2231-007X [Print] Call for Papers Data mining and knowledge discovery in databases have been attracting a significant amount of research, industry, and media attention of late. There is an urgent need for a new generation of computational theories and tools to assist researchers in extracting useful information from the rapidly growing volumes of digital data. This journal provides a forum for researchers who address this issue to present their work in a peer-reviewed, open-access venue. Authors are solicited to contribute to the journal by submitting articles that illustrate research results, projects, surveying works and industrial experiences that describe significant advances in the following areas, but are not limited to these topics only. Data Mining Foundations: Parallel and Distributed Data Mining Algorithms, Data Streams Mining, Graph Mining, Spatial Data Mining, Text, Video and Multimedia Data Mining, Web Mining, Pre-Processing Techniques, Visualization, Security and Information Hiding in Data Mining. Data Mining Applications: Databases, Bioinformatics, Biometrics, Image Analysis, Financial Modeling, Forecasting, Classification, Clustering, Social Networks, Educational Data Mining. Knowledge Processing: Data and Knowledge Representation, Knowledge Discovery Framework and Process (Including Pre- and Post-Processing), Integration of Data Warehousing, OLAP and Data Mining, Integrating Constraints and Knowledge in the KDD Process, Exploring Data Analysis, Inference of Causes, Prediction, Evaluating, Consolidating and Explaining Discovered Knowledge, Statistical Techniques for Generating a Robust, Consistent Data Model, Interactive Data Exploration, Visualization and Discovery, Languages and Interfaces for Data Mining, Mining Trends, Opportunities and Risks, Mining from Low-Quality Information Sources.
Paper submission: Authors are invited to submit papers for this journal through e-mail [email protected]. Submissions must be original and should not have been published previously or be under consideration for publication while being evaluated for this journal.
Views: 175 ijdkp jou
Shooting Down a Lost Drone and why Dogs Tilt their Heads - Smarter Every Day 173
 
09:28
Click to subscribe to the Sound Traveler: http://bit.ly/Sub2TheSoundTraveler Get a free audio book! http://www.audible.com/Smarter Click here if you're interested in subscribing to SED: http://bit.ly/Subscribe2SED ⇊ Click below for more links! ⇊ The Sound Traveler: https://www.youtube.com/thesoundtraveler Check out Viviane's Channel, Scilabus: https://www.youtube.com/user/scilabus Pupper videos were provided by Patrons of Smarter Every Day on Patreon! http://www.patreon.com/smartereveryday ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ GET SMARTER SECTION https://en.wikipedia.org/wiki/Sound_localization Temporal Cues: https://digital.lib.washington.edu/researchworks/bitstream/handle/1773/21842/Brown_washington_0250E_10994.pdf?sequence=1&isAllowed=y Spectral Cues: http://www.ee.usyd.edu.au/carlab/CARlabPublicationsData/PDF/flspec99-2069244168/flspec99.pdf Spectral Cues are an issue for hearing aids: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2610393/pdf/nihms66642.pdf https://books.google.com.co/books?hl=en&lr=&id=aBrrCAAAQBAJ&oi=fnd&pg=PA449&dq=dogs+sound+localization+head+tilt&ots=1z3k0frepd&sig=bd6NgJ-mPDRqwHsFH5tU_ioYtk0#v=onepage&q&f=false Dog Head Tilt: https://vcahospitals.com/know-your-pet/why-dogs-tilt-their-heads ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Tweet ideas to me at: http://twitter.com/smartereveryday I'm "ilikerockets" on Snapchat. Snap Code: http://i.imgur.com/7DGfEpR.png Smarter Every Day on Facebook https://www.facebook.com/SmarterEveryDay Smarter Every Day on Patreon http://www.patreon.com/smartereveryday Smarter Every Day on Instagram http://www.instagram.com/smartereveryday Smarter Every Day SubReddit http://www.reddit.com/r/smartereveryday Ambiance and musicy things by: Gordon McGladdery, who did the outro music for the video. http://ashellinthepit.bandcamp.com/ The thought is that my efforts making videos will help educate the world as a whole, and one day generate enough revenue to pay for my kids' college education.
Until then, if you appreciate what you've learned in this video and the effort that went into it, please SHARE THE VIDEO! If you REALLY liked it, feel free to pitch a few dollars to Smarter Every Day by becoming a Patron. http://www.patreon.com/smartereveryday
Views: 1974461 SmarterEveryDay
OSM Data Retrieving, Extracting Layers & Area Calculation using QGIS
 
09:30
This video shows how to download OpenStreetMap data using JOSM and extract specific layers using "Extract by attribute" in QGIS. Areas of polygons were calculated using the Field Calculator with the expression $area. Comment below if you'd like to know more about QGIS. Download JOSM (josm-tested.jar) at: https://josm.openstreetmap.de/wiki/Download and QGIS at: http://www.qgis.org/en/site/forusers/download.html Map Features for extracting layers: http://wiki.openstreetmap.org/wiki/Map_Features Follow me on Spotify and my playlists :D https://open.spotify.com/user/12174876450 Music: Beat of My Drum by Powers; Electric Love by BØRNS; Our Own House by MisterWives
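The $area expression mentioned above computes polygon area; on a projected (planar) CRS this is essentially the shoelace formula, sketched here in plain Python with made-up coordinates rather than PyQGIS calls:

```python
def polygon_area(vertices):
    """Shoelace formula: planar area of a simple polygon whose vertices
    are given in order (what $area reduces to on a projected CRS)."""
    total = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap around to the first vertex
        total += x1 * y2 - x2 * y1
    return abs(total) / 2.0

# A 100 m x 50 m rectangle in projected (metre) coordinates:
area = polygon_area([(0, 0), (100, 0), (100, 50), (0, 50)])
```

Note that depending on project settings QGIS can instead compute ellipsoidal areas, which this planar formula does not capture.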
Views: 866 Lj Rapi
SpatialHadoop: MapReduce Processing of Spatial Data in Hadoop
 
25:23
SpatialHadoop is an open-source MapReduce framework with built-in support for spatial data. It employs the MapReduce programming paradigm for distributed processing to build a general-purpose tool for large-scale analysis of spatial data on large clusters. Users can interact easily with SpatialHadoop through a high-level language with built-in support for spatial data types and spatial operations. Existing spatial data sets can be loaded into SpatialHadoop with the built-in spatial data types point, polygon and rectangle. In addition, the data sets are stored efficiently using built-in indexes (Grid file or R-tree), which speed up the retrieval and processing of these data sets. Users can build an index of their choice with a single command that runs in parallel on the machines in the cluster. Once the index is built, users can start analyzing their data sets using the built-in spatial operations (range query, k-nearest neighbor and spatial join).
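The retrieval speed-up from such an index comes from visiting only the index cells that overlap a query. A single-machine Python sketch of a grid-file-style range query (illustrative only, not SpatialHadoop's actual API):

```python
from collections import defaultdict

class GridIndex:
    """Toy grid-file index over points: a range query visits only the
    grid cells that overlap the query rectangle instead of every point."""

    def __init__(self, cell_size):
        self.cell_size = cell_size
        self.cells = defaultdict(list)

    def _cell(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def insert(self, x, y):
        self.cells[self._cell(x, y)].append((x, y))

    def range_query(self, xmin, ymin, xmax, ymax):
        cx0, cy0 = self._cell(xmin, ymin)
        cx1, cy1 = self._cell(xmax, ymax)
        hits = []
        for cx in range(cx0, cx1 + 1):
            for cy in range(cy0, cy1 + 1):
                for x, y in self.cells[(cx, cy)]:
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        hits.append((x, y))
        return hits

# Index a few invented points and run a range query:
idx = GridIndex(cell_size=4)
for p in [(1, 1), (5, 5), (9, 9)]:
    idx.insert(*p)
inside = idx.range_query(0, 0, 6, 6)
```

In SpatialHadoop the same idea is distributed: the index partitions data across the cluster so each machine scans only its relevant cells.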
Views: Andrea Ross
What is XYO Crypto? - Crypto-location Services to Track You on the Blockchain?
 
05:11
🚀 Get the Apps! ★ http://cryptoyum.com ★ http://coinpuffs.com 10 Days of Bitcoin: 💯 Free Email Course! ★ http://10daysofbitcoin.com Will XYO Cryptocurrency track you as you meander around the blockchain? // GET STARTED 🚀 Become a Cryptonaut - Support us on http://patreon.com/pub 💻 Join us at the PUB! - http://thebitcoin.pub 💰Get a Coinbase Wallet! - http://dctv.co/dctv-coinbase - Sign up! // WE DO SOCIAL 🔑 Decentralized Newsletter - https://dctv.co/dctv-news 📔 Twitter - https://dctv.co/dctv-twitter 📔 Facebook - https://dctv.co/dctv-fb 🔑 Instagram - https://dctv.co/dctv-instagram 💻 Google+ - https://dctv.co/dctv-googleplus ✏️ LinkedIn - https://dctv.co/dctv-linkedin 💻 Medium - https://dctv.co/dctv-medium Music by Charles Giovanniello, a Bitcoin Pub community member! Note: This is not financial advice as all investing is speculative. Have fun and good luck!
Views: 31581 Decentralized TV
GeoLocation Equals DataMining: Location-Aware Services DataMine Your Information
 
06:50
Beware of all these Location Apps that just DataMine your information and sell it to the Government and Marketers. If you do the research, you will find that most of these services actually ARE the government in disguise. Turn their disguise into your disgust and stop using these Government Database services! The same goes for all Twitter Apps, Email, and any other service that allows Free use in exchange for your Data. The Twit Wiki states: "It is perfectly acceptable to put any Twit Live clip on YouTube."
Views: 291 TechAndNews
Floyd Toole - Sound reproduction – art and science/opinions and facts
 
01:13:57
CIRMMT Distinguished Lectures in the Science and Technology of Music Floyd Toole, consultant to Harman International, USA 16 April 2015 - Tanna Schulich Hall http://www.cirmmt.org/activities/distinguished-lectures https://www.facebook.com/CIRMMT/ APA video citation: Toole, F. (2015, May 14). Sound reproduction - art and science/opinions and facts - CIRMMT Distinguished Lectures in the Science and Technology of Music. [Video file]. Retrieved from https://www.youtube.com/watch?v=zrpUDuUtxPM
Views: 51096 CIRMMT
What Is A Lineament Geology?
 
00:45
A lineament is a linear feature in a landscape that is the expression of an underlying geological structure such as a fault or fracture. In Precambrian shield (craton) areas, the long, straight (or only slightly bowed) fractures are called lineaments. Lineaments are extractable as linear features from satellite and aerial images and are often correlated with subsurface structures such as faults and fracture zones; although some geologists understand the connection between basement faults and (1) joints, (2) linears, and (3) lineaments, many do not. Lineament interpretation is usually supported by auxiliary data giving information about the local geology, and can be carried out visually or by semi-automated, object-based image analysis; studies have compared the results of semi-automated analyses against visual interpretation. Lineament mapping using remote sensing techniques is applied in structural geology, for example for carbon dioxide sequestration site characterization in central New York, for groundwater targeting, and for tectonic interpretation of topographic lineaments; other reported studies cover the South Jenein area of southern Tunisia, the Vindhyan basin north of Shivpuri (Madhya Pradesh), and extraction of different lineament types from Landsat 5 TM imagery. Zones of contiguous, nearly parallel lineaments and areas of homogeneous lineament density can be compiled into thematic maps, and lineament mapping and analysis have gained popularity with the increasing availability of imagery; geologic mapping using SLAR imagery in Puerto Rico, for instance, produced a lineament map.
A topographic lineament is a linear feature of regional extent that is believed to reflect underlying geological structure; the most prominent geologic structures are easily visible on such maps, high-resolution bathymetry can assist topographic lineament mapping, and transverse lineaments, which trend almost perpendicular to fault zones, can mark structural boundaries.
Views: 812 Robert Robert
Geo-information science | Wikipedia audio article
 
59:50
This is an audio version of the Wikipedia article: https://en.wikipedia.org/wiki/Geographic_information_system 00:01:44 1 History of development 00:08:57 2 Techniques and technology 00:09:57 2.1 Relating information from different sources 00:12:01 2.2 GIS uncertainties 00:14:00 2.3 Data representation 00:15:19 2.4 Data capture 00:20:09 2.5 Raster-to-vector translation 00:21:31 2.6 Projections, coordinate systems, and registration 00:22:21 3 Spatial analysis with geographical information system (GIS) 00:23:49 3.1 Slope and aspect 00:27:11 3.2 Data analysis 00:29:05 3.3 Topological modeling 00:29:42 3.4 Geometric networks 00:30:38 3.5 Hydrological modeling 00:32:10 3.6 Cartographic modeling 00:32:59 3.7 Map overlay 00:34:41 3.8 Geostatistics 00:37:16 3.9 Address geocoding 00:38:36 3.10 Reverse geocoding 00:39:26 3.11 Multi-criteria decision analysis 00:40:13 3.12 Data output and cartography 00:41:41 3.13 Graphic display techniques 00:43:40 3.14 Spatial ETL 00:44:21 3.15 GIS data mining 00:45:04 4 Applications 00:47:04 4.1 Open Geospatial Consortium standards 00:48:54 4.2 Web mapping 00:50:08 4.3 Adding the dimension of time 00:52:38 5 Semantics 00:55:00 6 Implications of GIS in society 00:55:55 6.1 GIS in education 00:56:54 6.2 GIS in local government Listening is a more natural way of learning when compared to reading. Written language only began at around 3200 BC, but spoken language has existed far longer. Learning by listening is a great way to: increase imagination and understanding, improve your listening skills, improve your spoken accent, learn while on the move, and reduce eye strain. Now learn the vast amount of general knowledge available on Wikipedia through audio articles. You could even learn subconsciously by playing the audio while you sleep! If you are planning to listen a lot, you could try using a bone-conduction headphone or a standard speaker instead of an earphone.
Listen on Google Assistant through Extra Audio: https://assistant.google.com/services/invoke/uid/0000001a130b3f91 Other Wikipedia audio articles at: https://www.youtube.com/results?search_query=wikipedia+tts Upload your own Wikipedia articles through: https://github.com/nodef/wikipedia-tts Speaking Rate: 0.914666102936741 Voice name: en-US-Wavenet-E "I cannot teach anybody anything, I can only make them think." - Socrates SUMMARY ======= A geographic information system (GIS) is a system designed to capture, store, manipulate, analyze, manage, and present spatial or geographic data. GIS applications are tools that allow users to create interactive queries (user-created searches), analyze spatial information, edit data in maps, and present the results of all these operations. GIS sometimes refers to geographic information science (GIScience), the science underlying geographic concepts, applications, and systems. GIS can refer to a number of different technologies, processes, techniques and methods. It is attached to many operations and has many applications related to engineering, planning, management, transport/logistics, insurance, telecommunications, and business. For that reason, GIS and location intelligence applications can be the foundation for many location-enabled services that rely on analysis and visualization. GIS can relate unrelated information by using location as the key index variable. Locations or extents in Earth space-time may be recorded as dates/times of occurrence, and x, y, and z coordinates representing longitude, latitude, and elevation, respectively. All Earth-based spatial-temporal location and extent references should be relatable to one another and ultimately to a "real" physical location or extent. This key characteristic of GIS has begun to open new avenues of scientific inquiry.
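The idea of location as the key index variable can be illustrated by joining two otherwise unrelated tables on a shared coordinate key; all records below are invented:

```python
# Two unrelated datasets keyed by the same (longitude, latitude) pairs.
# The coordinate tuple acts as the GIS "key index variable".
rainfall_mm = {(-122.42, 37.77): 600, (-73.99, 40.73): 1200}
population = {(-122.42, 37.77): 870000, (-73.99, 40.73): 8400000}

def join_on_location(a, b):
    """Inner join of two location-keyed tables on their shared keys."""
    return {loc: (a[loc], b[loc]) for loc in a.keys() & b.keys()}

combined = join_on_location(rainfall_mm, population)
```

Real GIS joins are more involved (coordinate precision, CRS transforms, spatial predicates), but the key-index principle is the same.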
Views: 3 wikipedia tts
11. Introduction to Machine Learning
 
51:31
MIT 6.0002 Introduction to Computational Thinking and Data Science, Fall 2016 View the complete course: http://ocw.mit.edu/6-0002F16 Instructor: Eric Grimson In this lecture, Prof. Grimson introduces machine learning and shows examples of supervised learning using feature vectors. License: Creative Commons BY-NC-SA More information at http://ocw.mit.edu/terms More courses at http://ocw.mit.edu
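In the spirit of the lecture's supervised learning with feature vectors (an independent sketch, not Prof. Grimson's code), a minimal nearest-centroid classifier over labeled examples:

```python
def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    dim = len(vectors[0])
    return tuple(sum(v[i] for v in vectors) / n for i in range(dim))

def train(examples):
    """examples: list of (feature_vector, label) pairs.
    Returns a model mapping each label to its class centroid."""
    by_label = {}
    for vec, label in examples:
        by_label.setdefault(label, []).append(vec)
    return {label: centroid(vecs) for label, vecs in by_label.items()}

def predict(model, vec):
    """Return the label whose centroid is nearest (squared Euclidean)."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist(model[label], vec))

# Invented feature vectors (weight_kg, top_speed_kmh) for two classes:
model = train([((4, 48), "cat"), ((5, 40), "cat"),
               ((30, 20), "tortoise"), ((40, 18), "tortoise")])
```

Training summarizes each class by a centroid; prediction assigns a new feature vector to the nearest one.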
Views: 461075 MIT OpenCourseWare
Mass spectrometry part 1 : introduction
 
24:09
For more information, log on to- http://shomusbiology.com/ Download the study materials here- http://shomusbiology.weebly.com/bio-materials.html Mass spectrometry (MS) is an analytical technique that produces spectra (singular spectrum) of the masses of the molecules comprising a sample of material. The spectra are used to determine the elemental composition of a sample, the masses of particles and of molecules, and to elucidate the chemical structures of molecules, such as peptides and other chemical compounds. Mass spectrometry works by ionizing chemical compounds to generate charged molecules or molecule fragments and measuring their mass-to-charge ratios.[1] In a typical MS procedure, a sample, which may be solid, liquid, or gas, is ionized. The ions are separated according to their mass-to-charge ratio.[1] The ions are detected by a mechanism capable of detecting charged particles. Signal processing results are displayed as spectra of the relative abundance of ions as a function of the mass-to-charge ratio. The atoms or molecules can be identified by correlating known masses to the identified masses or through a characteristic fragmentation pattern. A mass spectrometer consists of three components: an ion source, a mass analyzer, and a detector.[2] The ionizer converts a portion of the sample into ions. There is a wide variety of ionization techniques, depending on the phase (solid, liquid, gas) of the sample and the efficiency of various ionization mechanisms for the unknown species. An extraction system removes ions from the sample, which are then trajected through the mass analyzer and onto the detector. The differences in masses of the fragments allows the mass analyzer to sort the ions by their mass-to-charge ratio. The detector measures the value of an indicator quantity and thus provides data for calculating the abundances of each ion present. Some detectors also give spatial information, e.g. a multichannel plate. 
Mass spectrometry has both qualitative and quantitative uses. These include identifying unknown compounds, determining the isotopic composition of elements in a molecule, and determining the structure of a compound by observing its fragmentation. Other uses include quantifying the amount of a compound in a sample or studying the fundamentals of gas-phase ion chemistry (the chemistry of ions and neutrals in a vacuum). MS is now in very common use in analytical laboratories that study physical, chemical, or biological properties of a great variety of compounds. As an analytical technique it possesses distinct advantages such as: 1. Increased sensitivity over most other analytical techniques, because the analyzer, as a mass-charge filter, reduces background interference. 2. Excellent specificity from characteristic fragmentation patterns to identify unknowns or confirm the presence of suspected compounds. 3. Information about molecular weight. 4. Information about the isotopic abundance of elements. 5. Temporally resolved chemical data. A few disadvantages of the method are that it often fails to distinguish between optical and geometrical isomers and the positions of substituents in o-, m- and p- positions in an aromatic ring. Also, its scope is limited in identifying hydrocarbons that produce similar fragment ions.[3] Source of the article published in description is Wikipedia. I am sharing their material. Copyright by original content developers of Wikipedia. Link- http://en.wikipedia.org/wiki/Main_Page
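The mass-to-charge separation described above is easy to compute for protonated positive ions; the neutral mass below is a made-up example (the proton mass is the standard monoisotopic value):

```python
PROTON_MASS = 1.007276  # Da, monoisotopic mass of a proton (H+)

def mz(neutral_mass, charge):
    """m/z of a protonated positive ion: (M + z * m_proton) / z."""
    return (neutral_mass + charge * PROTON_MASS) / charge

# A hypothetical neutral molecule of exactly 1000 Da observed at three
# charge states; higher charge shifts the peak to lower m/z.
peaks = {z: mz(1000.0, z) for z in (1, 2, 3)}
```

This is why one analyte can produce several peaks: each charge state lands at its own m/z, and the spacing between them reveals the charge.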
Views: 232836 Shomu's Biology
Mining The European Library: Spatial and Temporal Facetting
 
01:39
This video describes the spatial and temporal mining possibilities to extract relevant resources about Pope Pius.
Views: 790 theeuropeanlibrary
Rotational Motion
 
10:40
053 - Rotational Motion In this video Paul Andersen explains how a net torque acting on an object will create rotational motion. This motion can be described by the angular displacement, angular velocity, and angular acceleration. The linear velocity can be calculated by determining the distance from the axis of rotation. The net torque is equal to the product of the rotational inertia and the angular acceleration. Do you speak another language? Help me translate my videos: http://www.bozemanscience.com/translations/ Music Attribution Title: String Theory Artist: Herman Jolly http://sunsetvalley.bandcamp.com/track/string-theory All of the images are licensed under creative commons and public domain licensing: “Moment of Inertia.” Wikipedia, the Free Encyclopedia, September 7, 2014. http://en.wikipedia.org/w/index.php?title=Moment_of_inertia&oldid=623976736. “Radian.” Wikipedia, the Free Encyclopedia, September 6, 2014. http://en.wikipedia.org/w/index.php?title=Radian&oldid=614093713.
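The relationships in the description (net torque equals rotational inertia times angular acceleration, and linear velocity depends on distance from the axis) can be checked with a short computation; the numbers are arbitrary illustration values:

```python
def angular_acceleration(net_torque, rotational_inertia):
    """alpha = tau / I: net torque over rotational inertia."""
    return net_torque / rotational_inertia

def linear_velocity(radius, angular_velocity):
    """v = r * omega: linear speed at distance r from the rotation axis."""
    return radius * angular_velocity

# Example: a 12 N*m net torque on a body with I = 3 kg*m^2,
# and a point 0.5 m from the axis while omega = 4 rad/s.
alpha = angular_acceleration(12.0, 3.0)  # rad/s^2
v = linear_velocity(0.5, 4.0)            # m/s
```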
Views: 192346 Bozeman Science
Space Shuttle STS-99 Endeavour Shuttle Radar Topography Mission (SRTM) 2000 NASA 15min
 
14:53
more at http://scitech.quickfound.net/astro/space_shuttle_news.html 'STS-99 POST FLIGHT PRESENTATION JSC1819 - (2000) - 15 Minutes - Commander: Kevin Kregel Pilot: Dominic L. Pudwill Gorie Mission Specialists: Gerhard P.J. Thiele, Janet Kavandi, Janice Voss, Mamoru Mohri Dates: February 11-22, 2000 Vehicle: Endeavour OV-105 Payloads: STRM and EarthKam Landing Site: Runway 33 at Kennedy Space Center, FL' NASA film JSC-1819 Public domain film slightly cropped to remove uneven edges, with the aspect ratio corrected, and mild video noise reduction applied. The soundtrack was also processed with volume normalization, noise reduction, clipping reduction, and equalization. http://en.wikipedia.org/wiki/STS-99 STS-99 was a Space Shuttle Endeavour mission, that launched on 11 February 2000 from Kennedy Space Center, Florida. The primary objective of the mission was the Shuttle Radar Topography Mission (SRTM) project. The Shuttle Radar Topography Mission (SRTM) is an international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR. Its objective is to obtain the most complete high-resolution digital topographic database of the Earth. SRTM consists of a specially modified radar system that flew on board the space shuttle during its 11-day mission. This radar system gathered around 8 terabytes of data to produce unrivaled 3-D images of the Earth's surface. SRTM uses C-band and X-band interferometric synthetic aperture radar (IFSAR) to acquire topographic data of Earth's land mass (between 60°N and 56°S). It produces digital topographic map products which meet Interferometric Terrain Height Data (ITHD)-2 specifications (30 meter x 30 meter spatial sampling with 16 meter absolute vertical height accuracy, 10 meter relative vertical height accuracy and 20 meter absolute horizontal circular accuracy). 
The result of the Shuttle Radar Topography Mission could be close to 1 trillion measurements of the Earth's topography. Besides contributing to the production of better maps, these measurements could lead to improved water drainage modeling, more realistic flight simulators, better locations for cell phone towers, and enhanced navigation safety. The Shuttle Radar Topography Mission mast was deployed successfully to its full length, and the antenna was turned to its operation position. After a successful checkout of the radar systems, mapping began at 00:31 EST, less than 12 hours after launch. Crewmembers, split into two shifts so they could work around the clock, began mapping an area from 60 degrees north to 56 degrees south. Data was sent to Jet Propulsion Laboratory for analysis and early indications showed the data to be of excellent quality... Radar data gathering concluded at 06:54 EST on the tenth day of flight after a final sweep across Australia. During 222 hours and 23 minutes of mapping, Endeavour's radar images filled 332 high density tapes and covered 99.98 % of the planned mapping area -- land between 60 degrees north latitude and 56 degrees south latitude -- at least once and 94.6 % of it twice. Only about 80,000 square miles (210,000 km2) in scattered areas remained unimaged, most of them in North America and most already well mapped by other methods. Enough data was gathered to fill the equivalent of 20,000 CDs. Also aboard Endeavour was a student experiment called EarthKAM, which took 2,715 digital photos during the mission through an overhead flight-deck window... Endeavour also saw the recommissioning of the Spacelab Pallet system, used for experiments in vacuum. The 2007 Smithsonian Networks documentary Oasis Earth was made about the mission... This was the last mission to fly with the standard cockpit in 18 straight years. 
A glass cockpit was first flown on the next mission (STS-101). This was also the last solo flight of Space Shuttle Endeavour; all of its subsequent launches were International Space Station missions.
Views: 10649 Jeff Quitney
Document Classification with Neo4j
 
39:35
Graphs are a perfect solution to organize information and to determine the relatedness of content. In this webinar, Neo4j Developer Evangelist Kenny Bastani will discuss using Neo4j to perform document classification. He will demonstrate how to build a scalable architecture for classifying natural language text using a graph-based algorithm called Hierarchical Pattern Recognition. This approach encompasses a set of techniques familiar to Deep Learning practitioners. Kenny will then introduce a new Neo4j unmanaged extension that can train natural language models on Wikipedia articles to determine which articles are most related based on a vector of shared features. Speaker: Kenny Bastani, Developer Evangelist, Neo Technology Kenny Bastani is an accomplished software development consultant and entrepreneur with 10+ years of industry experience as a front-end and back-end engineer. Kenny has demonstrated leadership in designing and developing enterprise-grade web applications for high-volume, high-availability environments, with innovative focuses on solving unsupervised machine learning problems that enable businesses to better manage their institutional memory. As both an entrepreneur and software designer based in the SF Bay Area, Kenny has gained valuable experience leading teams in both product design and software architecture.
Views: 7813 Neo4j
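The relatedness step described in the entry above, scoring article pairs by a vector of shared features, can be sketched with cosine similarity over sparse feature counts. This is a toy illustration in plain Python, not the Neo4j extension itself; the feature vectors and "article" names are invented:

```python
import math
from collections import Counter

def cosine_similarity(a, b):
    """Cosine of the angle between two sparse feature vectors (dicts of counts)."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Toy feature vectors (feature -> count), as might be extracted per article
graph_db = Counter({"graph": 4, "node": 3, "query": 2})
relational = Counter({"table": 4, "query": 3, "index": 2})
social_net = Counter({"graph": 3, "node": 2, "friend": 4})

# The two graph-flavoured "articles" share more features, so they score higher
print(cosine_similarity(graph_db, social_net) > cosine_similarity(graph_db, relational))  # True
```

In a graph database the same comparison can be pushed into the graph itself: articles link to shared feature nodes, and relatedness falls out of counting and weighting the paths between them.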
The Case for Small Data Management
 
42:08
Abstract: Exabytes of data; several hundred thousand TPC-C transactions per second on a single computing core; scale-up to hundreds of cores and a dozen Terabytes of main memory; scale-out to thousands of nodes with close to Petabyte-sized main memories; and massively parallel query processing are a reality in data management. But, hold on a second: for how many users exactly? How many users do you know that really have to handle these kinds of massive datasets and extreme query workloads? On the other hand: how many users do you know that are fighting to handle relatively small datasets, say in the range of a few thousand to a few million rows per table? How come some of the most popular open source DBMSs have hopelessly outdated optimizers producing inefficient query plans? How come people don’t care and love it anyway? Could it be that most of the world’s data management problems are actually quite small? How can we increase the impact of database research in areas where datasets are small? What are the typical problems? What does this mean for database research? We discuss research challenges, directions, and a concrete technical solution coined PDbF: Portable Database Files. This is an extended version of an abstract and Gong Show talk presented at CIDR 2015. This talk was held on March 6, 2015 at the German Database Conference BTW in Hamburg. http://www.btw-2015.de/?keynote_dittrich Short CV: Jens Dittrich is a Full Professor of Computer Science in the area of Databases, Data Management, and "Big Data" at Saarland University, Germany. Previous affiliations include U Marburg, SAP AG, and ETH Zurich. He is also associated with CISPA (Center for IT-Security, Privacy and Accountability). 
He received an Outrageous Ideas and Vision Paper Award at CIDR 2011, a BMBF VIP Grant, a best paper award at VLDB 2014, two CS teaching awards in 2011 and 2013, as well as several presentation awards including a qualification for the interdisciplinary German science slam finals in 2012 and three presentation awards at CIDR (2011, 2013, and 2015). His research focuses on fast access to big data including in particular: data analytics on large datasets, Hadoop MapReduce, main-memory databases, and database indexing. He has been a PC member and/or area chair of prestigious international database conferences such as PVLDB, SIGMOD, and ICDE. Since 2013 he has been teaching his classes on data management as flipped classrooms. See http://datenbankenlernen.de or http://youtube.com/jensdit for a list of freely available videos on database technology in German and English (about 80 videos in German and 80 in English so far). image credits: public domain http://commons.wikimedia.org/wiki/File:The_Blue_Marble.jpg CC, Laura Poitras / Praxis Films http://commons.wikimedia.org/wiki/File:Edward_Snowden-2.jpg http://creativecommons.org/licenses/by/3.0/legalcode istock, voyager624 http://www.istockphoto.com/stock-photo-20540898-blue-digital-tunnel.php?st=0d10b3d http://commons.wikimedia.org/wiki/Category:Egg_sandwich?uselang=de#mediaviewer/File:Sandwich_Huevo_-_Ventana.JPG http://creativecommons.org/licenses/by/3.0/legalcode زرشک CC BY-SA 3.0, http://creativecommons.org/licenses/by-sa/3.0/legalcode public domain, http://en.wikipedia.org/wiki/Tanker_%28ship%29#mediaviewer/File:Sirius_Star_2008b.jpg public domain, http://de.wikipedia.org/wiki/General_Dynamics_F-16#mediaviewer/File:General_Dynamic_F-16_USAF.jpg ©iStock.com: skynesher public domain, http://commons.wikimedia.org/wiki/File:Astronaut-EVA.jpg others: Jens Dittrich, http://datenbankenlernen.de
Ville Pulkki: "Developing spatial sound techniques for human listeners"
 
19:57
Aalto University Tenured Professors' Installation Lectures Jan. 19 2016. "Developing spatial sound techniques for human listeners" Ville Pulkki Department of Signal Processing and Acoustics School of Electrical Engineering http://www.aalto.fi/en/about/careers/tenure_track/ Video by Teekkarien elokuvakerho Montaasi Ry Production Aalto University Communications 2016
Views: 902 Aalto University
Weka Data Mining Tutorial for First Time & Beginner Users
 
23:09
23-minute beginner-friendly introduction to data mining with WEKA. Examples of algorithms to get you started with WEKA: logistic regression, decision tree, neural network and support vector machine. Update 7/20/2018: I put data files in .ARFF here http://pastebin.com/Ea55rc3j and in .CSV here http://pastebin.com/4sG90tTu Sorry uploading the data file took so long...it was on an old laptop.
Views: 449448 Brandon Weinberg
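One of the algorithms the tutorial above demonstrates, the decision tree, chooses each split by information gain (entropy reduction). A minimal pure-Python sketch of that selection step, on an invented toy dataset rather than WEKA's own implementation:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, feature):
    """Entropy reduction from splitting the rows on one categorical feature."""
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[feature], []).append(label)
    weighted = sum(len(ls) / len(labels) * entropy(ls) for ls in by_value.values())
    return entropy(labels) - weighted

# Toy "play tennis"-style data: outlook is predictive, windy is noise
rows = [{"outlook": "sunny", "windy": "no"},
        {"outlook": "sunny", "windy": "yes"},
        {"outlook": "rain",  "windy": "no"},
        {"outlook": "rain",  "windy": "yes"}]
labels = ["yes", "yes", "no", "no"]

# The tree's root test is the feature with the highest information gain
best = max(["outlook", "windy"], key=lambda f: information_gain(rows, labels, f))
print(best)  # outlook
```

A full tree learner simply recurses: split on the best feature, then repeat on each partition until the labels are pure.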
First Landsat: "Earth Resources Technology Satellite" (ERTS) 1973 NASA
 
27:32
NASA & Space Miscellany playlist: https://www.youtube.com/playlist?list=PL_hX5wLdhf_K3mK1TZNCkmdD-JMZYGew1 more at http://scitech.quickfound.net/environment/environment_news.html "National Aeronautics and Space Administration This film illustrates how the Earth Resources Technology Satellite (ERTS) helped to meet the need for a worldwide survey of Earth resources in order to assist scientists and governments plan their use and conservation." Produced for NASA by Audio Productions. NASA film HQ-223 Reupload of a previously uploaded film with improved video & sound. Public domain film from the US National Archives, slightly cropped to remove uneven edges, with the aspect ratio corrected, and one-pass brightness-contrast-color correction & mild video noise reduction applied. The soundtrack was also processed with volume normalization, noise reduction, clipping reduction, and/or equalization (the resulting sound, though not perfect, is far less noisy than the original). http://creativecommons.org/licenses/by-sa/3.0/ http://en.wikipedia.org/wiki/Landsat_1 Landsat 1, originally named "Earth Resources Technology Satellite 1", was the first satellite of the United States' Landsat program. It was a modified version of the Nimbus 4 meteorological satellite and was launched on July 23, 1972 by a Delta 900 rocket from Vandenberg Air Force Base in California. The near-polar orbiting spacecraft served as a stabilized, Earth-oriented platform for obtaining information on agricultural and forestry resources, geology and mineral resources, hydrology and water resources, geography, cartography, environmental pollution, oceanography and marine resources, and meteorological phenomena. 
To accomplish these objectives, the spacecraft was equipped with: - a three-camera return-beam vidicon (RBV) to obtain visible light and near infrared photographic images of Earth; - a four-channel multispectral scanner (MSS) to obtain radiometric images of Earth; - a data collection system (DCS) to collect information from remote, individually equipped ground stations and to relay the data to central acquisition stations. The satellite also carried two wide-band video tape recorders (WBVTR) capable of storing up to 30 minutes of scanner or camera data, giving the spacecraft's sensors a near-global coverage capability. An advanced attitude control system consisting of horizon scanners, sun sensors, and a command antenna combined with a freon gas propulsion system permitted the spacecraft's orientation to be maintained within plus or minus 0.7 degrees in all three axes. Spacecraft communications included a command subsystem operating at 154.2 and 2106.4 MHz and a PCM narrow-band telemetry subsystem, operating at 2287.5 and 137.86 MHz, for spacecraft housekeeping, attitude, and sensor performance data. Video data from the three-camera RBV system was transmitted in both real-time and tape recorder modes at 2265.5 MHz, while information from the MSS was constrained to a 20 MHz radio-frequency bandwidth at 2229.5 MHz. In 1976, Landsat 1 discovered a tiny uninhabited island 20 kilometers off the eastern coast of Canada. This island was thereafter designated Landsat Island after the satellite. As of 2006, it is the only island to be discovered via satellite imagery. The spacecraft was turned off on January 6, 1978, when cumulative precession of the orbital plane caused the spacecraft to become overheated due to near-constant exposure to sunlight. http://en.wikipedia.org/wiki/Landsat_program The Landsat program is the longest running enterprise for acquisition of satellite imagery of Earth. On July 23, 1972 the Earth Resources Technology Satellite was launched. 
This was eventually renamed to Landsat. The most recent, Landsat 7, was launched on April 15, 1999. The instruments on the Landsat satellites have acquired millions of images. The images, archived in the United States and at Landsat receiving stations around the world, are a unique resource for global change research and applications in agriculture, cartography, geology, forestry, regional planning, surveillance, education and national security. Landsat 7 data has eight spectral bands with spatial resolutions ranging from 15 to 60 meters; the temporal resolution is 16 days. Hughes Santa Barbara Research Center initiated design and fabrication of the first three MSS Multispectral Scanners... The first prototype MSS was completed within nine months by fall of 1970 when it was tested by scanning Half Dome at Yosemite National Park. ...In 1979, Presidential Directive 54 under President of the United States Jimmy Carter transferred Landsat operations from NASA to NOAA... and recommended transition to private sector operation of Landsat. This occurred in 1985 when the Earth Observation Satellite Company (EOSAT), a partnership of Hughes Aircraft and RCA, was selected by NOAA...
Views: 2536 Jeff Quitney
Isolation - Mind Field (Ep 1)
 
34:46
What happens when your brain is deprived of stimulation? What effect does being cut off from interaction with the outside world have on a person? What effect does it have on me, when I am locked in a windowless, soundproof isolation chamber for three days? In this episode of Mind Field, I take both an objective and a very intimate look at Isolation. Available with YouTube Premium - https://www.youtube.com/premium/originals. To see if Premium is available in your country, click here: https://goo.gl/A3HtfP
Views: 20634733 Vsauce
Heatmap overview of large datasets (Argo)
 
00:55
See https://github.com/geonetwork/core-geonetwork/wiki/WFS-Filters-based-on-WFS-indexing-with-SOLR for details
Views: 125 Francois Prunayre
What is DATA PREPARATION? What does DATA PREPARATION mean? DATA PREPARATION meaning & explanation
 
03:51
What is DATA PREPARATION? What does DATA PREPARATION mean? DATA PREPARATION meaning - DATA PREPARATION definition - DATA PREPARATION explanation. Source: Wikipedia.org article, adapted under https://creativecommons.org/licenses/by-sa/3.0/ license. SUBSCRIBE to our Google Earth flights channel - https://www.youtube.com/channel/UC6UuCPh7GrXznZi0Hz2YQnQ Data Preparation is the act of preparing (or pre-processing) "raw" data or disparate data sources into refined information assets that can be used effectively for various business purposes such as analysis. Data Preparation is a necessary, but often tedious, activity that is a critical first step in data analytics projects for Data wrangling. Data Preparation can include many discrete tasks such as loading data or data ingestion, data fusion, data cleansing, data augmentation and data delivery (writing out the prepared data to databases, file systems or applications). Data Cleansing is one of the most common tasks in Data Preparation. Common data cleansing activities involve ensuring the data is: Valid – falls within required constraints (e.g. data has correct data type), matches required patterns (e.g. phone numbers look like phone numbers), no cross-field issues (e.g. the state/province field only has valid values for the specific country in a Country field) Complete – ensuring all necessary data is available and where possible, looking up needed data from external sources (e.g. finding the Zip/Postal code of an address via an external data source) Consistent – eliminating contradictions in the data (e.g. correcting the fact that the same individual may have different birthdates in different records or datasets) Uniform – ensuring common data elements follow common standards in the data (e.g. uniform date/time formats across fields, uniform units of measure for weights, lengths) Accurate – where possible ensuring data is verifiable with an authoritative source (e.g. 
business information is referenced against a D&B database to ensure accuracy) Given the variety of data sources that provide data and the formats that data can arrive in, data preparation can be quite involved and complex. There are many tools and technologies that are used for data preparation. Traditional tools and technologies, such as scripting languages or ETL and Data Quality tools, are not meant for business users. They typically require programming or IT skills that most business users don’t have. A number of startups such as Trifacta, Paxata and Alteryx have created software that is intended to help business users with little or no programming background efficiently perform data preparation. These products typically provide a visual interface that displays the data and allows the user to directly explore, structure, clean, augment and update the data as needed. The software often automatically analyses the data, providing the user with profiles and statistics on the data’s content, as well as semantic and machine learning algorithms that assist the user in making decisions on how to change the data for their needs. Once the preparation work is complete, the preparation steps can be used to generate reusable recipes that can be run on other datasets to perform the same operations. This code generation and reuse provides a significant productivity boost when compared to more traditional manual and hand-coding methods for data preparation.
Views: 219 The Audiopedia
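The cleansing checks described in the entry above (valid, complete, uniform) can be sketched in a few lines of Python. The record fields, phone pattern, and date formats below are invented for illustration, not from any particular tool:

```python
import re
from datetime import datetime

def clean_record(rec):
    """Apply a few toy cleansing checks; return the record plus a list of problems."""
    problems = []
    # Valid: field matches a required pattern (toy North American phone format)
    if not re.fullmatch(r"\d{3}-\d{3}-\d{4}", rec.get("phone", "")):
        problems.append("invalid phone")
    # Complete: all necessary fields are present
    for field in ("name", "phone", "dob"):
        if not rec.get(field):
            problems.append(f"missing {field}")
    # Uniform: normalise dates arriving in several formats to one ISO format
    for fmt in ("%Y-%m-%d", "%d/%m/%Y", "%m-%d-%Y"):
        try:
            rec["dob"] = datetime.strptime(rec.get("dob", ""), fmt).date().isoformat()
            break
        except ValueError:
            pass
    else:
        problems.append("unparseable dob")
    return rec, problems

rec, problems = clean_record({"name": "Ada", "phone": "555-867-5309", "dob": "10/12/1990"})
print(rec["dob"], problems)  # 1990-12-10 []
```

The "reusable recipe" idea mentioned above is exactly this: once `clean_record` encodes the decisions, it can be rerun unchanged on every new batch of records.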
metadata extraction using ExifTool on ubuntu 17.04
 
03:55
ExifTool is a free and open-source software program for reading, writing, and manipulating image, audio, video, and PDF metadata. It is platform independent, available as both a Perl library (Image::ExifTool) and a command-line application. ExifTool is commonly incorporated into different types of digital workflows and supports many types of metadata including Exif, IPTC, XMP, JFIF, GeoTIFF, ICC Profile, Photoshop IRB, FlashPix, AFCP and ID3, as well as the manufacturer-specific metadata formats of many digital cameras. Source: https://en.wikipedia.org/wiki/ExifTool -------------------------------------------------------------------------------------------------- Command: sudo apt-get install libimage-exiftool-perl --------------------------------------------------------------------------------------------------
Views: 866 Tech ind
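At the byte level, the Exif metadata that ExifTool reads from a JPEG lives in an APP1 marker segment whose payload starts with "Exif\0\0". A minimal sketch of locating that segment, using hand-built toy bytes (this illustrates the container format only, and is in no way a replacement for ExifTool):

```python
import struct

def find_exif_segment(jpeg_bytes):
    """Scan JPEG marker segments for APP1/Exif and return its TIFF payload, or None."""
    assert jpeg_bytes[:2] == b"\xff\xd8"  # SOI marker starts every JPEG
    i = 2
    while i + 4 <= len(jpeg_bytes):
        # Each segment: 2-byte marker, then 2-byte big-endian length (includes itself)
        marker, length = struct.unpack(">HH", jpeg_bytes[i:i + 4])
        payload = jpeg_bytes[i + 4:i + 2 + length]
        if marker == 0xFFE1 and payload.startswith(b"Exif\x00\x00"):
            return payload[6:]  # the embedded TIFF structure holding the tags
        i += 2 + length

# Build a toy JPEG: SOI + one APP1 segment carrying a fake Exif payload
tiff = b"II*\x00fake-tiff-data"
app1 = b"Exif\x00\x00" + tiff
jpeg = b"\xff\xd8" + b"\xff\xe1" + struct.pack(">H", len(app1) + 2) + app1

print(find_exif_segment(jpeg) == tiff)  # True
```

Parsing the individual tags inside that TIFF structure is the hard part, which is exactly what ExifTool's hundreds of format tables handle.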
Lucene  Indexing Tutorial | Solr Indexing Tutorial | Search Engine Indexing | Solr Tutorial |Edureka
 
16:32
( Apache Solr Certification Training - https://www.edureka.co/apache-solr-self-paced ) Watch the sample class recording: http://www.edureka.co/apache-solr?utm_source=youtube&utm_medium=referral&utm_campaign=understanding-lucene-indexing Indexing is the process of creating indexes for record collections. Having indexes allows researchers to more quickly find records for specific individuals; without them, researchers might have to look through hundreds or thousands of records to locate an individual record. The topics covered in the video: 1. Need for Search Engines 2. Why Indexing 3. Indexing Flow 4. Lucene: Writing to Index 5. Lucene: Searching in Index 6. Lucene: Inverted Indexing Technique 7. Lucene: Storage Schema Related post: http://www.edureka.co/blog/apache-solr-shedding-some-light/?utm_source=youtube&utm_medium=referral&utm_campaign=understanding-solr-indexing Edureka is a New Age e-learning platform that provides Instructor-Led Live, Online classes for learners who would prefer a hassle-free and self-paced learning environment, accessible from any part of the world. The topics related to 'Understanding Solr Indexing' have been covered in our course ‘Apache Solr‘. For more information, please write back to us at [email protected] Call us at US: 1800 275 9730 (toll free) or India: +91-8880862004
Views: 29492 edureka!
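The inverted indexing technique covered in the entry above can be sketched in a few lines of Python: map each term to the documents containing it, then answer an AND query by intersecting postings lists. This is a toy model of the idea, not Lucene's actual storage schema:

```python
from collections import defaultdict

def build_inverted_index(docs):
    """Map each term to the sorted list of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return {term: sorted(ids) for term, ids in index.items()}

def search(index, *terms):
    """AND query: ids of documents containing every term (postings intersection)."""
    postings = [set(index.get(t.lower(), ())) for t in terms]
    return sorted(set.intersection(*postings)) if postings else []

docs = {1: "Lucene builds an inverted index",
        2: "Solr is built on Lucene",
        3: "An index speeds up search"}
index = build_inverted_index(docs)
print(search(index, "lucene", "index"))  # [1]
```

The point of the inversion is that query cost scales with the length of the postings lists touched, not with the total number of documents, which is why an index beats scanning "hundreds or thousands of records".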
Manual/Automatic classification and segmentation
 
05:53
Manual/Automatic classification and automatic segmentation for small photogrammetric datasets. Goal: extracting rocks from the ambient background (ground) and segmenting them so that you can export individual point clouds for further processing. Methodology: Software = CloudCompare Beta 2.8 (http://www.danielgm.net/cc/release/ it has to be this one because the CSF filter is not included in previous versions) 1. Manual classification with heightmap. The easiest way to classify your data if you have a highly contrasted and flat dataset (which is almost never the case) - Clone your PCL (point cloud) to keep the original RGB information somewhere (if relevant) - Compute the heightmap as RGB - Convert the RGB values to Scalar Fields - Pick the relevant classification values with the Scalar Field histogram - Proceed with "select by values" to extract the relevant part of your data - Start again if you need multiple classification parameters - Clear the RGB colors from each extracted PCL and transfer the RGB values from the cloned PCL (if relevant again) 2. Automatic classification with CSF plugin (see CloudCompare documentation for more information http://www.cloudcompare.org/doc/wiki/index.php?title=CSF_(plugin) ) A more robust alternative - It does not work with very small datasets (here around 4m²) so we have to scale up the PCL to trick the plugin into thinking it's a relatively big area - Still, I recommend using the finest settings to get good results with this very example - In the end, you get two PCLs with extracted features 3. Automatic segmentation - If your extracted features are somehow isolated from one another, you can run the segmentation tool (Tools - Segmentation - Label Connected Comp) - You get in return a list of each feature as a separate PCL, ranked in descending order of volume - The point here is very specific: we need to export each feature separately to run surface and volume analysis in other software.
Views: 8539 nazg
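The classify-then-segment workflow described above (threshold on height to separate features from ground, then group connected points) can be sketched outside CloudCompare. A naive pure-Python illustration with invented coordinates and thresholds, not the CSF or connected-components code itself:

```python
from math import dist

def segment(points, ground_z=0.1, eps=0.5):
    """Classify points above a height threshold as features, then group
    nearby feature points into connected components (naive O(n^2) BFS)."""
    feats = [p for p in points if p[2] > ground_z]
    seen, components = set(), []
    for start in feats:
        if start in seen:
            continue
        comp, queue = [], [start]
        seen.add(start)
        while queue:
            p = queue.pop()
            comp.append(p)
            for q in feats:  # neighbours within eps join the same component
                if q not in seen and dist(p, q) < eps:
                    seen.add(q)
                    queue.append(q)
        components.append(comp)
    return components

# Two "rocks" poking above flat ground
cloud = [(0, 0, 0.0), (1, 1, 0.05),          # ground
         (2.0, 2.0, 0.4), (2.2, 2.1, 0.5),   # rock A
         (5.0, 5.0, 0.3)]                    # rock B
print(len(segment(cloud)))  # 2
```

Each returned component plays the role of one exportable per-feature point cloud; real tools replace the O(n^2) neighbour scan with an octree or grid.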
What is DATABASE TUNING? What does DATABASE TUNING mean? DATABASE TUNING meaning & explanation
 
04:33
What is DATABASE TUNING? What does DATABASE TUNING mean? DATABASE TUNING meaning - DATABASE TUNING definition - DATABASE TUNING explanation. Source: Wikipedia.org article, adapted under https://creativecommons.org/licenses/by-sa/3.0/ license. SUBSCRIBE to our Google Earth flights channel - https://www.youtube.com/channel/UC6UuCPh7GrXznZi0Hz2YQnQ Database tuning describes a group of activities used to optimize and homogenize the performance of a database. It usually overlaps with query tuning, but refers to design of the database files, selection of the database management system (DBMS) application, and configuration of the database's environment (operating system, CPU, etc.). Database tuning aims to maximize use of system resources to perform work as efficiently and rapidly as possible. Most systems are designed to manage their use of system resources, but there is still much room to improve their efficiency by customizing their settings and configuration for the database and the DBMS. Hardware and software configuration of disk subsystems are examined: RAID levels and configuration, block and stripe size allocation, and the configuration of disks, controller cards, storage cabinets, and external storage systems such as SANs. Transaction logs and temporary spaces are heavy consumers of I/O, and affect performance for all users of the database. Placing them appropriately is crucial. Frequently joined tables and indexes are placed so that as they are requested from file storage, they can be retrieved in parallel from separate disks simultaneously. Frequently accessed tables and indexes are placed on separate disks to balance I/O and prevent read queuing. DBMS tuning refers to tuning of the DBMS and the configuration of the memory and processing resources of the computer running the DBMS. This is typically done through configuring the DBMS, but the resources involved are shared with the host system. 
Tuning the DBMS can involve setting the recovery interval (time needed to restore the state of data to a particular point in time), assigning parallelism (the breaking up of work from a single query into tasks assigned to different processing resources), and choosing the network protocols used to communicate with database consumers. Memory is allocated for data, execution plans, procedure cache, and work space. It is much faster to access data in memory than data on storage, so maintaining a sizable cache of data makes activities perform faster. The same consideration is given to work space. Caching execution plans and procedures means that they are reused instead of recompiled when needed. It is important to take as much memory as possible, while leaving enough for other processes and the OS to use without excessive paging of memory to storage. Processing resources are sometimes assigned to specific activities to improve concurrency. On a server with eight processors, six could be reserved for the DBMS to maximize available processing resources for the database. Database maintenance includes backups, column statistics updates, and defragmentation of data inside the database files. On a heavily used database, the transaction log grows rapidly. Transaction log entries must be removed from the log to make room for future entries. Frequent transaction log backups are smaller, so they interrupt database activity for shorter periods of time. DBMSs use statistics histograms to find data in a range against a table or index. Statistics updates should be scheduled frequently and sample as much of the underlying data as possible. Accurate and updated statistics allow query engines to make good decisions about execution plans, as well as efficiently locate data. Defragmentation of table and index data increases efficiency in accessing data. 
The amount of fragmentation depends on the nature of the data, how it is changed over time, and the amount of free space in database pages to accept inserts of data without creating additional pages.
Views: 607 The Audiopedia
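The statistics histograms mentioned above can be illustrated with a small sketch: an equi-width histogram over a column, plus the uniform-spread estimate an optimizer might make for a range predicate. The bucket count and data are invented, and no specific DBMS's estimator is being reproduced:

```python
def build_histogram(values, buckets=4):
    """Equi-width histogram: a (low, high, count) triple per bucket."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / buckets
    counts = [0] * buckets
    for v in values:
        i = min(int((v - lo) / width), buckets - 1)  # clamp max value into last bucket
        counts[i] += 1
    return [(lo + i * width, lo + (i + 1) * width, counts[i]) for i in range(buckets)]

def estimate_range(hist, lo, hi):
    """Estimated row count for `lo <= col < hi`, assuming uniform spread per bucket."""
    est = 0.0
    for b_lo, b_hi, count in hist:
        overlap = max(0.0, min(hi, b_hi) - max(lo, b_lo))
        if overlap > 0:
            est += count * overlap / (b_hi - b_lo)
    return est

values = list(range(100))       # column values 0..99
hist = build_histogram(values)  # 4 buckets of 25 rows each
print(estimate_range(hist, 0, 49.5))  # 50.0 (exactly the first two buckets)
```

Estimates like this feed directly into the optimizer's choice between, say, an index seek and a full scan, which is why stale statistics produce bad plans.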
What Is A Mining Shaft?
 
00:45
https://goo.gl/6U6t22 - Subscribe For more Videos ! For more Health Tips | Like | Comment | Share : ▷ CONNECT with us!! #HealthDiaries ► YOUTUBE - https://goo.gl/6U6t22 ► Facebook - https://goo.gl/uTP7zG ► Twitter - https://twitter.com/JuliyaLucy ► G+ Community - https://goo.gl/AfUDpR ► Google + - https://goo.gl/3rcniv ► Visit us - http://healthaware.in/ ► Blogger - https://juliyalucy.blogspot.in/ Watch for more Health Videos: ► How To Avoid Unwanted Pregnancy Naturally: https://goo.gl/hRy93e ► Period Hacks || How To Stop Your Periods Early: https://goo.gl/dSmFgi ► Cold and Flu Home Remedies: https://goo.gl/biPp8b ► Homemade Facial Packs: https://goo.gl/NwV5zj ► How To Lose Belly Fat In 7 Days: https://goo.gl/EHN879 ► Powerfull Foods for Control #Diabetes: https://goo.gl/9SdaLY ► Natural Hand Care Tips At Home That Work: https://goo.gl/YF3Exa ► How to Tighten #SaggingBreast: https://goo.gl/ENnb6b ► Natural Face Pack For Instant Glowing Skin: https://goo.gl/gvd5mM ► Get Rid of Stretch Marks Fast & Permanently: https://goo.gl/ZVYvQZ ► Eating Bananas with Black Spots: https://goo.gl/gXuri6 ► Drink this Juice every day to Cure #Thyroid in 3 Days: https://goo.gl/L3537H ► How Garlic Improves Sexual Stamina? https://goo.gl/GNcbYU ► Benefits of using Egg Shells: https://goo.gl/hAUyUS ► Home Remedies to Gain Weight Fast: https://goo.gl/jBVVQh ► Amazing Benefits of Olive Oil for Health: https://goo.gl/R3583v ► Rapid Relief of Chest Pain (Angina): https://goo.gl/idAFZR ► Home Remedies for Joint & Arthritis Pains Relief: https://goo.gl/jRbNkh ► SHOCKING TRICKs For #Diabetes Control: https://goo.gl/ATDDsV ► Doctors Are Shocked! #Diabetics: https://goo.gl/ZeQddJ ► Home Remedies for Gastric Troubles: https://goo.gl/72VR1b ► Juice for #Diabetics Type 2: https://goo.gl/3vDMqR --------- The collar is the point at which a shaft intersects the surface or an underground haulage level. 
Shaft mining is a form of underground mining that uses shafts driven vertically from the surface down into the earth to access ore or minerals. A mine shaft is a vertical or sloping passageway made in the earth for finding and mining ore and for ventilating underground excavations; a vertical opening is usually called a shaft, and the process of excavating one is called shaft sinking. Shaft design must weigh the cost of the various mine access options (vertical shaft, decline, and incline) for both new and existing mines, and published reviews of shaft design, construction, and sinking procedures, together with lists of consultants, contractors, and suppliers, aim to give the industry a quantitative basis for that comparison. Key words: ore body, design, shaft, vertical, decline, incline, cost. 
Shaft-sinking contractors such as Redpath and Aveng Mining (through Aveng Mining Shafts & Underground, a world leader in the shaft sinking and access development market) work in some of the most extreme conditions and remote locations on earth, including the jungles of South America, the Gobi Desert of Mongolia, and the high Arctic. During shaft construction, designers, engineers, and operational staff have to deal with huge flows of technical and spatial data, so user-friendly tools that provide effective access to these data, and that support evaluation and forecasting of situations that may arise during sinking, are valuable. The Chilean mine rescue, in which the larger-scale drilling of an escape shaft made slow progress until, on day 69, rescuers lifted each of the miners out alive [source: Boston], reminded the non-mining world of this usually invisible work.
Views: 65 Fredda Winkleman
Blue Origin's New Partnership with PARC, an R&D Services Company
 
05:12
I’m going to talk about Blue Origin’s partnership with research and development services company PARC. Welcome to NeoScribe, if you’re new to my channel, I explore everything that is cool about the future, so hit the subscribe button and notification bell so you don't miss out! Alright, before we get started, let's talk about Research and Development, or R&D. A company’s success, and continued success, relies on R&D. This is because R&D leads to new products, services or processes. While this may seem critical for mostly tech companies, just about every industry pours resources into R&D. Take the personal care industry for instance, this is a Quilted Northern toilet paper ad from the 1930’s. Look at the upper right corner, SPLINTER FREE!? Yes, before the 30’s it was common to get splinters from TP. And thanks to R&D, we don’t have to worry about splinters with today’s TP. But as the world, and the world of business, becomes more complex, managing R&D investments has become complex as well. Companies today don’t blindly fund R&D like they did before. They now track and measure the success of their R&D efforts, or return on investment, because not all R&D efforts lead to profitable products or product improvements, and spending on R&D can be risky for some companies. So, when companies want to invest in R&D in areas that are outside of their expertise, or when smaller companies want to minimize risk with R&D, they can look to PARC to help them with their R&D efforts. PARC was founded in 1970 and is an independent subsidiary company of Xerox, headquartered out of Palo Alto, California. They provide custom R&D services to Fortune 500 and Global 1000 companies, startups, and government agencies. They have over 175 world-class scientists, own 2,000 patents while filing over 150 patents per year, and have written over 4,000 scientific papers. 
PARC’s motto is the business of breakthroughs, and according to their brochure: PARC offers the holistic approach you need with the right balance of knowledge and pragmatic action to get more from your innovation dollar. Explore business model implications and the full range of disruptive advances across physical, social, and computer sciences. Over the years, PARC has invented or been involved in the invention of many products. But the most impressive or ground-breaking invention was the Xerox Alto computer. The Xerox Alto was the first computer to have a graphical user interface operating system, or GUI. Before GUIs like Windows and Mac OS, computers were not as user friendly and ran on command-line interfaces, which had a steeper learning curve than simply double-clicking icons. Sources: https://www.parc.com/ https://www.investopedia.com/articles/fundamental-analysis/10/research-development-rorc.asp?lgl=rira-layout-cpa-bsln https://www.investopedia.com/articles/fundamental/03/072303.asp http://whoonew.com/2013/08/green-bay-toilet-paper-wiping-butts/ https://en.wikipedia.org/wiki/Xerox_Alto https://www.news.xerox.com/news/PARC-to-partner-with-Blue-Origin-to-accelerate-space-research-and-development
Views: 2049 NeoScribe
Inge Druckrey: Teaching to See
 
37:01
Para subtítulos en español de click en CC. 中文字幕請按CC. Directed by Andrei Severny, produced by Edward Tufte. The Design Observer: "Perceiving Deeply" http://designobserver.com/feature/perceiving-deeply/38497/ FastCompany: "If you do one thing today, watch this 40-minute crash course in Design Thinking." http://www.fastcodesign.com/1670615/a-40-minute-crash-course-in-design-thinking 99U by Behance: "Watch the first five minutes and you'll be hooked." http://99u.com/workbook/14370/video-how-to-see-like-a-artist Lifehacker: "Beautifully made film." http://lifehacker.com/5991521/learn-to-see-like-an-artist "This [film] is about patient and dedicated teaching, about learning to look and visualize in order to design, about the importance of drawing. It is one designer's personal experience of issues that face all designers, expressed with sympathy and encouragement, and illustrated with examples of Inge [Druckrey]'s own work and that of grateful generations of her students. There are simple phrases that give insights into complex matters, for example that letterforms are 'memories of motion.' Above all, it is characteristic of Inge that in this examination of basic principles the word "beautiful" is used several times." Matthew Carter, type designer, MacArthur Fellow "This film is absolutely beautiful. I'm so impressed with it and learned so much in such a compact piece. I feel like it picked up where Helvetica left off with the subtle principles of typographical balance and some early history stemming from the human hand. Your wonderful teaching approach comes through loud and clear and stands as an inspiration and model for others including myself. This is fantastic." Luke Geissbuhler, Cinematographer of Helvetica and other films "A great documentation of the visual values we hold dear." Roger Remington, Vignelli Distinguished Professor of Design, RIT "A fine, insightful and educational documentary. 
It captures Inge's work as a designer and educator, her thinking and her SEEING, in a wonderful and most perfect way. Truly Inspirational!" Hans-Ulrich Allemann, Designer/Educator http://TeachingToSee.org http://www.facebook.com/TeachingToSee http://www.imdb.com/title/tt2365382
Views: 59200 Edward Tufte
How to Build a 4K Editing Computer (More cores are not always better) - Smarter Every Day 202
 
09:39
For $50 off select Casper mattresses, go to http://casper.com/smarter and use promo code: smarter (Terms and conditions apply). BEHIND THE SCENES: https://www.youtube.com/watch?v=ph-uq_B5TSI ⇊ Click below for more links! ⇊ HOW TO BUILD A COMPUTER 1. DON'T BUY THE MOST EXPENSIVE MACHINE. 2. RESEARCH ACTUAL BENCHMARK DATA 3. BUY HARDWARE BASED ON YOUR SOFTWARE APPLICATION 4. More cores doesn't mean it's better for you! Side note: The fast rendering capability of this new machine actually let me eat dinner with my family on the first night I used it. This is incredibly important to me. A special thank you to Puget Systems for allowing me to visit and for helping me ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ GET SMARTER SECTION I asked Jon to put together the specs on the computer I spec'd out https://www.pugetsystems.com/go/smarter Amdahl's Law https://en.wikipedia.org/wiki/Amdahl%27s_law Moore's Law https://en.wikipedia.org/wiki/Moore%27s_law ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Tweet Ideas to me at: http://twitter.com/smartereveryday I'm "ilikerockets" on Snapchat. Snap Code: http://i.imgur.com/7DGfEpR.png Smarter Every Day on Facebook https://www.facebook.com/SmarterEveryDay Smarter Every Day on Patreon http://www.patreon.com/smartereveryday Smarter Every Day On Instagram http://www.instagram.com/smartereveryday Smarter Every Day SubReddit http://www.reddit.com/r/smartereveryday Ambiance and musicy things by: Gordon McGladdery did the outro music for the video. http://ashellinthepit.bandcamp.com/ The thought is that my efforts making videos will help educate the world as a whole, and one day generate enough revenue to pay for my kids' college education. Until then, if you appreciate what you've learned in this video and the effort that went into it, please SHARE THE VIDEO! If you REALLY liked it, feel free to pitch a few dollars to Smarter Every Day by becoming a Patron. http://www.patreon.com/smartereveryday Warm Regards, Destin
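The "more cores doesn't mean it's better" point above is Amdahl's Law in miniature: if only a fraction of a rendering job can run in parallel, adding cores quickly stops paying off. A minimal sketch (the 95%-parallel figure is an illustrative assumption, not a number from the video):

```python
def amdahl_speedup(p, n):
    """Amdahl's Law: overall speedup on n cores when a fraction p of the work is parallelizable."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a 95%-parallel workload flattens out far below the core count:
# the serial 5% caps the speedup at 1 / (1 - p) = 20x, no matter how many cores you buy.
for cores in (2, 8, 32, 128):
    print(cores, round(amdahl_speedup(0.95, cores), 2))
```

This is why benchmarking your actual software matters more than raw core counts: a mostly serial editing workload benefits more from fewer, faster cores.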
Views: 580233 SmarterEveryDay
Topographic Maps: "Grid Distance and Elevation" 1966 US Army Training Film
 
28:35
more at http://search.quickfound.net/map_search_and_news.html Basic Map Reading Part II Grid, Distance and Elevation "USE OF MILITARY GRID SYSTEM TO LOCATE POSITIONS; USE OF GRAPHIC SCALES TO MEASURE DISTANCE; USE OF CONTOUR LINES TO IDENTIFY TOPOGRAPHIC DETAIL." US Army Training Film TF5-3719 see also: Direction, Orientation and Location with a Compass https://www.youtube.com/watch?v=nfDFqdqJe2o Public domain film from the National Archives, slightly cropped to remove uneven edges, with the aspect ratio corrected, and mild video noise reduction applied. The soundtrack was also processed with volume normalization, noise reduction, clipping reduction, and equalization (the resulting sound, though not perfect, is far less noisy than the original). http://en.wikipedia.org/wiki/Topographic_map A topographic map is a type of map characterized by large-scale detail and quantitative representation of relief, usually using contour lines in modern mapping, but historically using a variety of methods. Traditional definitions require a topographic map to show both natural and man-made features. A topographic map is typically published as a map series, made up of two or more map sheets that combine to form the whole map. A contour line is a line joining points of equal elevation; contour lines are how a topographic map represents relief. The Canadian Centre for Topographic Information provides this definition of a topographic map: A topographic map is a detailed and accurate graphic representation of cultural and natural features on the ground. Other authors define topographic maps by contrasting them with another type of map; they are distinguished from smaller-scale "chorographic maps" that cover large regions, "planimetric maps" that do not show elevations, and "thematic maps" that focus on specific topics. 
However, in the vernacular and day-to-day world, the representation of relief (contours) is popularly held to define the genre, such that even small-scale maps showing relief are commonly (and erroneously, in the technical sense) called "topographic". The study or discipline of topography, while interested in relief, is actually a much broader field of study which takes into account all natural and man-made features of terrain. Topographic maps are based on topographical surveys. Performed at large scales, these surveys are called topographical in the old sense of topography, showing a variety of elevations and landforms. This is in contrast to older cadastral surveys, which primarily show property and governmental boundaries. The first multi-sheet topographic map series of an entire country, the Carte géométrique de la France, was completed in 1789. The Great Trigonometric Survey of India, started by the East India Company in 1802, then taken over by the British Raj after 1857, was notable as a successful effort on a larger scale and for accurately determining heights of Himalayan peaks from viewpoints over one hundred miles distant... United States The United States Geological Survey (USGS), a civilian Federal agency, produces several national series of topographic maps which vary in scale and extent, with some wide gaps in coverage, notably the complete absence of 1:50,000 scale topographic maps or their equivalent. The largest (both in terms of scale and quantity) and best-known topographic series is the 7.5-minute or 1:24,000 quadrangle. This scale is unique to the United States, where nearly every other developed nation has introduced a metric 1:25,000 or 1:50,000 large scale topo map. The USGS also publishes 1:100,000 maps covering 30 minutes latitude by one degree longitude, 1:250,000 covering one by two degrees, and state maps at 1:500,000, with California, Michigan and Montana needing two sheets while Texas has four. 
Alaska is mapped on a single sheet, at scales ranging from 1:1,584,000 to 1:12,000,000. Recent USGS digital US Topo 1:24,000 topo maps based on the National Map omit several important geographic details that were featured in the original USGS topographic map series (1945-1992). Examples of omitted details and features include power transmission lines, telephone lines, railroads, recreational trails, pipelines, survey marks, and buildings. For many of these feature classes, the USGS is working with other agencies to develop data or adapt existing data on missing details that will be included in The National Map and in US Topo. In other areas USGS digital map revisions may omit geographic features such as ruins, mine locations, springs, wells, and even trails in an effort to protect natural resources and the public at large, or because such features are not present in any public domain database...
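To make the scale figures above concrete, a ratio like 1:24,000 simply means one unit on the map covers 24,000 of the same unit on the ground. The sketch below does that conversion (the measured distances are illustrative assumptions, not from the film):

```python
def ground_distance_m(map_distance_cm, scale_denominator):
    """Convert a distance measured on the map (in cm) to ground distance in metres."""
    # 1 cm on the map covers scale_denominator cm on the ground; 100 cm = 1 m.
    return map_distance_cm * scale_denominator / 100.0

# On a USGS 1:24,000 quadrangle, 4 cm on the sheet spans 960 m of terrain;
# on a metric 1:50,000 sheet, 2 cm spans exactly 1 km.
print(ground_distance_m(4, 24_000))   # -> 960.0
print(ground_distance_m(2, 50_000))   # -> 1000.0
```

The same arithmetic underlies the graphic scale bars the training film demonstrates: they are just pre-computed ground distances drawn at map scale.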
Views: 6236 Jeff Quitney
Apache Solr Tutorial for Beginners -2 | Apache Lucene Tutorial -2 | Solr Search Tutorial | Edureka
 
02:05:52
( Apache Solr Certification Training - https://www.edureka.co/apache-solr-self-paced ) Watch the sample class recording: http://www.edureka.co/apache-solr?utm_source=youtube&utm_medium=referral&utm_campaign=solr-tut-2 Apache Solr, based on the Lucene library, is an open-source, enterprise-grade search engine and platform used to provide fast and scalable search features. Solr, which stands for “Search on Lucene and Resin,” was created in 2004 by Yonik Seeley. Its major features include full-text search, hit highlighting, faceted search, dynamic clustering, database integration and rich document (example: Word, PDF) handling. Solr, primarily written in Java, runs as a standalone full-text search server within a servlet container, using the Lucene Java search library. 1. Understand Analyzers 2. Understand Querying 3. Understand Scoring 4. Understand Boosting 5. Understand Highlighting 6. Understand Faceting 7. Understand Grouping 8. Understand Joins 9. Understand Spatial Search 10. Understand Apache Tika. Related post: http://www.edureka.co/blog/apache-solr-shedding-some-light/?utm_source=youtube&utm_medium=referral&utm_campaign=solr-tut-2 http://www.edureka.co/blog/solr19thoct/?utm_source=youtube&utm_medium=referral&utm_campaign=solr-tut-2 Edureka is a New Age e-learning platform that provides Instructor-Led Live, Online classes for learners who would prefer a hassle free and self paced learning environment, accessible from any part of the world. The topics related to ‘Apache Solr & Lucene' have been covered in our course ‘Apache Solr‘. For more information, please write back to us at [email protected] Call us at US: 1800 275 9730 (toll free) or India: +91-8880862004
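As a rough illustration of the querying and faceting topics listed above: a Solr search is just an HTTP GET against a core's /select handler, with faceting switched on by the `facet` and `facet.field` parameters. The sketch below only builds such a URL with Python's standard library; the host, port, and core name `books` are assumptions for illustration, not details from the tutorial.

```python
from urllib.parse import urlencode

# Hypothetical local Solr instance and core name -- adjust to your own setup.
SOLR_SELECT = "http://localhost:8983/solr/books/select"

def solr_query_url(query, facet_field=None, rows=10):
    """Build a Solr /select URL: a full-text query plus optional single-field faceting."""
    params = [("q", query), ("rows", rows), ("wt", "json")]
    if facet_field:
        # facet=true enables faceting; facet.field names the field whose values get counted.
        params += [("facet", "true"), ("facet.field", facet_field)]
    return SOLR_SELECT + "?" + urlencode(params)

url = solr_query_url("title:lucene", facet_field="author")
print(url)
```

Fetching that URL against a running Solr instance would return JSON with both the matching documents and per-author facet counts.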
Views: 23878 edureka!
The Third Industrial Revolution: A Radical New Sharing Economy
 
01:44:59
The global economy is in crisis. The exponential exhaustion of natural resources, declining productivity, slow growth, rising unemployment, and steep inequality force us to rethink our economic models. Where do we go from here? In this feature-length documentary, social and economic theorist Jeremy Rifkin lays out a road map to usher in a new economic system. A Third Industrial Revolution is unfolding with the convergence of three pivotal technologies: an ultra-fast 5G communication internet, a renewable energy internet, and a driverless mobility internet, all connected to the Internet of Things embedded across society and the environment. This 21st century smart digital infrastructure is giving rise to a radical new sharing economy that is transforming the way we manage, power and move economic life. But with climate change now ravaging the planet, it needs to happen fast. Change of this magnitude requires political will and a profound ideological shift. To learn more visit: https://impact.vice.com/thethirdindustrialrevolution Click here to subscribe to VICE: http://bit.ly/Subscribe-to-VICE Check out our full video catalog: http://bit.ly/VICE-Videos Videos, daily editorial and more: http://vice.com More videos from the VICE network: https://www.fb.com/vicevideo Click here to get the best of VICE daily: http://bit.ly/1SquZ6v Like VICE on Facebook: http://fb.com/vice Follow VICE on Twitter: http://twitter.com/vice Follow us on Instagram: http://instagram.com/vice Download VICE on iOS: http://apple.co/28Vgmqz Download VICE on Android: http://bit.ly/28S8Et0
Views: 3232325 VICE
Ethereum Core Devs Meeting #43 [07/27/18]
 
01:34:41
Agenda: https://github.com/ethereum/pm/issues/51
Views: 2343 Ethereum Foundation
What I learned from going blind in space | Chris Hadfield
 
18:23
There's an astronaut saying: In space, "there is no problem so bad that you can't make it worse." So how do you deal with the complexity, the sheer pressure, of dealing with dangerous and scary situations? Retired colonel Chris Hadfield paints a vivid portrait of how to be prepared for the worst in space (and life) -- and it starts with walking into a spider's web. Watch for a special space-y performance. TEDTalks is a daily video podcast of the best talks and performances from the TED Conference, where the world's leading thinkers and doers give the talk of their lives in 18 minutes (or less). Look for talks on Technology, Entertainment and Design -- plus science, business, global issues, the arts and much more. Find closed captions and translated subtitles in many languages at http://www.ted.com/translate Follow TED news on Twitter: http://www.twitter.com/tednews Like TED on Facebook: https://www.facebook.com/TED Subscribe to our channel: http://www.youtube.com/user/TEDtalksDirector
Views: 4523872 TED
Final Year Projects 2015 | Evaluating Wiki Collaborative Features in Ontology Authoring
 
17:13
Including Packages ======================= * Complete Source Code * Complete Documentation * Complete Presentation Slides * Flow Diagram * Database File * Screenshots * Execution Procedure * Readme File * Addons * Video Tutorials * Supporting Softwares Specialization ======================= * 24/7 Support * Ticketing System * Voice Conference * Video On Demand * * Remote Connectivity * * Code Customization ** * Document Customization ** * Live Chat Support * Toll Free Support * Call Us:+91 967-774-8277, +91 967-775-1577, +91 958-553-3547 Shop Now @ http://clickmyproject.com Get Discount @ https://goo.gl/lGybbe Chat Now @ http://goo.gl/snglrO Visit Our Channel: http://www.youtube.com/clickmyproject Mail Us: [email protected]
Views: 144 Clickmyproject
H3 Podcast #43 - Vsauce
 
02:19:34
Big appreciation to Vsauce for slammin' with us! Thanks to http://omahasteaks.com (search for h3) & http://lyft.com/h3 & http://meundies.com/h3 & http://joinhoney.com/h3 for sponsoring us! MERCH... http://h3h3shop.com Watch LIVE every Friday at 3pm PST: https://www.twitch.tv/h3h3productions/ ITUNES -- https://goo.gl/desgTE GOOGLE PLAY MUSIC -- https://goo.gl/EnllKV Twitter...................... https://twitter.com/h3h3productions Hila's Twitter............ https://twitter.com/hilakleinh3 MERCH.................... http://h3h3shop.com Instagram................ http://instagram.com/h3h3productions Hila's Instragram..... https://www.instagram.com/kleinhila Website.................... http://h3h3productions.com Subreddit................. http://reddit.com/r/h3h3productions Podcast Theme song by EchoRobot: https://soundcloud.com/aldenchambers/h3h3-podcast-theme-ft-nelward-and-custodian
Views: 2215557 H3 Podcast
David Wilcock | Corey Goode: The Antarctic Atlantis [MUST SEE LIVE DISCLOSURE!]
 
02:30:07
Are we about to hear that ancient ruins have been found in Antarctica? Is there an Alliance working to defeat the greatest threat humanity has ever faced on earth? Could the Antarctic Atlantis be part of a full or partial disclosure? Join David Wilcock on a thrill ride of discovery, beginning with Part One where he presents data on the Secret Space Program and shares the stage with legendary insider Corey Goode. This is the best public summary David and Corey have done of this amazing story that has captivated the UFO community. Part Two begins at the 53-minute mark, with David connecting the dots between intel from multiple insiders to arrive at a stunning conclusion -- that we are on the verge of major new releases of information that will transform everything we thought we knew about life on earth. A civilization of "Pre-Adamite" giants with elongated skulls appears to have crash-landed on a continent we now call Antarctica some 55,000 years ago. Various groups we collectively call the Alliance are working to defeat the Cabal / Illuminati / New World Order, thus making the headlines crazier by the day. If the Alliance succeeds, their plan is now to begin the disclosure process by telling us there was a civilization in Antarctica. We are already seeing multiple, compelling hints of this in corporate media. Find out what the insiders are telling us and help spread the word! This is a two-and-a-half-hour excerpt from David's Friday and Saturday presentations at the Conscious Life Expo 2017. In their original form they run six hours. David also spoke for three hours on Monday, presenting an incredible new model of the Cosmos based on sacred geometry -- and proving that the Sun is going to release a DNA-transforming burst of energy in our near future. Reposting this video is stealing, so please share the link with your friends but do not re-upload it anywhere else. Our team does issue takedowns and it could lead to the loss of your channel. 
Please help us by subscribing to this channel! Sign up at http://dwilcock.com to be notified of new articles and videos upon release, and to get free gifts and Ascension updates from David as they become available. Thank you for your support! David's OFFICIAL Patreon site: https://www.patreon.com/dwilcock
Test Your Knowledge: Database Basics
 
02:17
The following 5 questions relate to database basics as part of the Information Systems & Databases unit of the IPT course: 1. List the two types of databases discussed in this unit 2. What is the name of a database comprised of a single table? 3. List the six areas of a Data Dictionary 4. Define the term ‘Relational Database’ 5. What are the three elements of a Database Schema?
The "Is-is" Epidemic / Steve's Grammatical Observations (ep.1)
 
02:02
You need to say "is" only once, in certain contexts.
Views: 149917 Stepthos
Impact Acceleration: Astrostatistics
 
03:48
Dr Roberto Trotta reveals how he uses astrostatistics to solve everyday problems. Find out more https://www.imperial.ac.uk/research-and-innovation/funding-opportunities/internal-funding-opportunities/impact-acceleration-accounts/
Team Gushue Highway – Kenmount Rd. to Topsail Rd.
 
01:04
Drone video of the Team Gushue Highway on August 3, 2018. Other recent work completed this construction season includes the placement of the concrete median along the route, wiring for street lighting, and the construction of an underpass near Topsail Road and a roundabout at the Blackmarsh Road intersection. More information: http://www.releases.gov.nl.ca/releases/2018/tw/0813n02.aspx
Views: 18112 GovNL
Learn to capture knowledge through graphical and formal ontology techniques
 
01:22
https://www.udemy.com/practical-knowledge-modelling/ FREE for a limited period with coupon code KNOW-101
Views: 826 Tish
After watching this, your brain will not be the same | Lara Boyd | TEDxVancouver
 
14:25
In a classic research-based TEDx Talk, Dr. Lara Boyd describes how neuroplasticity gives you the power to shape the brain you want. Recorded at TEDxVancouver at Rogers Arena on November 14, 2015. YouTube Tags: brain science, brain, stroke, neuroplasticity, science, motor learning, identity, TED, TEDxVancouver, TEDxVancouver 2015, Vancouver, TEDx, Rogers Arena, Vancouver speakers, Vancouver conference, ideas worth spreading, great idea, Our knowledge of the brain is evolving at a breathtaking pace, and Dr. Lara Boyd is positioned at the cutting edge of these discoveries. In 2006, she was recruited by the University of British Columbia to become the Canada Research Chair in Neurobiology and Motor Learning. Since that time she has established the Brain Behaviour Lab, recruited and trained over 40 graduate students, published more than 80 papers and been awarded over $5 million in funding. Dr. Boyd’s efforts are leading to the development of novel, and more effective, therapeutics for individuals with brain damage, but they are also shedding light on broader applications. By learning new concepts, taking advantage of opportunities, and participating in new activities, you are physically changing who you are, and opening up a world of endless possibility. This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx
Views: 23444230 TEDx Talks