This use case reports on the impressive output, hallucinations, instability, and limitations of three Large Language Models (LLMs): ChatGPT, Gemini, and Grok. The LLMs were prompted in an investigative sequence and their responses were checked. The collected information supports the common theory that chronic wasting disease (CWD) in North America originated in 1967 at a research facility in Fort Collins, Colorado, where deer were reported to have been exposed to sheep with a similar disease, scrapie. Findings include that: no sheep with scrapie were detected in the area around Fort Collins prior to 1967; domestic sheep reportedly exposed to scrapie were in the facility; there were medical experiments; Fort Collins was active in the scrapie eradication program; three early infection sites, all linked to Fort Collins, are missing from USGS maps showing the disease history; the recently discovered European CWD cases can be explained by local conditions; scrapie in sheep and deer with CWD symptoms were reported centuries ago in Europe; and early models simulating disease history lacked adequate data and detail and ignored the presence of infected captive herds. The LLMs provided good insight into disease simulation, created simulation models, and generated Python code.
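As a rough illustration of the last point, the following is a minimal sketch, assuming a simple grid-based, discrete-time spread model, of the kind of Python simulation code an LLM might generate for a CWD-style scenario; the grid size, origin cell, and rates are hypothetical placeholders, not the models discussed in the paper.

```python
# Minimal sketch (not the paper's model): discrete-time spread of infection
# prevalence on a grid from a single origin cell. Grid size, origin, and
# rates are hypothetical placeholders.
import numpy as np

def simulate_spread(size=50, origin=(25, 25), years=50,
                    local_growth=0.3, spread_rate=0.05, seed=0):
    rng = np.random.default_rng(seed)
    prevalence = np.zeros((size, size))
    prevalence[origin] = 0.01  # initial infection at the origin facility

    for _ in range(years):
        # logistic growth of prevalence within each cell
        prevalence += local_growth * prevalence * (1 - prevalence)
        # stochastic spillover to neighboring cells in each direction
        spill = spread_rate * prevalence
        for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
            prevalence += rng.random((size, size)) * np.roll(spill, shift, axis=axis)
        prevalence = np.clip(prevalence, 0.0, 1.0)
    return prevalence

if __name__ == "__main__":
    result = simulate_spread()
    print("cells with prevalence above 1%:", int((result > 0.01).sum()))
```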
An Economic Model of Cable Television: Franchise Bidding in Philadelphia
A proliferation of telecommunications services and a relaxation of federal regulations have encouraged the construction of cable television systems in urban markets. The current practice of allowing competitive bidding for the exclusive franchise to build a cable system is intended to provide monopoly services at competitive prices without the need for extensive public regulation. In this dissertation an economic model is developed to address the question of efficient (Ramsey) prices and optimal system configuration for a multi-product cable system. Econometric and market research techniques are combined to produce equations describing the demand for cable television. An engineering analysis and econometric equations describe costs. The fair rate of return on equity invested in the system is estimated using the Capital Asset Pricing Model. All equations are combined in an interactive computer model formatted as a pro-forma return-on-investment analysis. Significant economies of scale are measured for the number of subscribers served by a system and for the size of the geographic area. Demand interdependencies among cable television services suggest that the monthly subscriber fee for basic-cable service should decline in real terms as the number of pay-television services increases. A microanalytic examination of the 1979 franchise bidding in Philadelphia reveals that one efficiently priced system resulted when the number of bidders was large. Most prices, however, were set at about industry averages. Competitive bidding generally produced optimal system configurations, with the exception that government officials encouraged excessive investment in public facilities.
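The fair rate of return mentioned above follows the standard Capital Asset Pricing Model formula; a minimal sketch in Python, with illustrative inputs rather than the values estimated in the dissertation:

```python
# Capital Asset Pricing Model: expected (fair) return on equity.
def capm_return(risk_free_rate, beta, market_return):
    return risk_free_rate + beta * (market_return - risk_free_rate)

# Illustrative inputs only, not the dissertation's estimates.
fair_return = capm_return(risk_free_rate=0.04, beta=1.2, market_return=0.10)
print(f"fair rate of return on equity: {fair_return:.1%}")  # 11.2%
```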
Historical search data, describing the volume of searches by topic and region, have recently become freely available. This provides a potentially valuable source of data useful for business intelligence about conditions external to the organization where data is sometimes sparse. As an experiment for a business application, Google searches on the keyword “foreclosure” were correlated with actual U.S. home foreclosures over the past 4 years. The resulting regression analysis shows a very good correlation, indicating that searches on “foreclosure” provide a very accurate estimate of trends in actual U.S. home foreclosures and may provide an early warning system. In a related non-business experiment, Google has recently reported success in showing that searches on the term “flu” track closely with worldwide outbreaks of flu.
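A minimal sketch of the kind of regression described, assuming a hypothetical monthly search-volume index and a matching foreclosure series (neither is the paper's actual data):

```python
import numpy as np
from scipy import stats

# Hypothetical placeholder series; the paper used actual Google search data
# and U.S. foreclosure counts over a four-year period.
search_index = np.array([42, 45, 50, 58, 63, 70, 76, 81, 85, 90, 94, 100])
foreclosures = np.array([210, 220, 240, 280, 300, 335, 360, 390, 400, 430, 450, 480])  # thousands

result = stats.linregress(search_index, foreclosures)
print(f"slope={result.slope:.2f}, r^2={result.rvalue**2:.3f}, p={result.pvalue:.2g}")

# Estimate foreclosures from a new month's search index
new_index = 88
print(f"predicted: {result.intercept + result.slope * new_index:.0f} thousand")
```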
Information for decision making may be publicly available, but costly to obtain. As an experiment in environmental scanning, the internet was searched on a daily basis over several years to collect information and provide analysis related to decisions on deer management. The process discovered that, contrary to common assumptions, the U.S. deer population has apparently been falling since about the year 2000, based on analysis of available state data that had not been aggregated. In some cases, state population estimates were created using standard procedures on available data. Results indicate that differences in survey methods appear to be relatively constant over time, as does the ratio of hunting data to official state population estimates. While reliability intervals for population estimates are wide, population trend reliability is relatively high. Analyses of Connecticut and California illustrate problems with the population estimates. In Connecticut, an independent group that financed some local surveys asserts the state has overestimated the population. In California, some population estimates reported to the public are inconsistent with historic information, masking the dramatic decline of the deer population in the state.
Systems analysts and managers involved in planning for computer system deployments and upgrades often require an estimate of future component performance. With a focus on processors used in desktop systems, this paper examines how long-run technical trends, such as those expressed in Moore’s Law, can be used to predict future computing performance. Historical data on the number of transistors and clock speeds were collected for Intel microprocessors beginning with their introduction in 1971. The data show that Moore’s Law significantly overestimates long-term development, which has been increasing but at a decreasing rate. Results of experiments with other prediction equations are also presented.
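A minimal sketch of how such trend comparisons can be made, assuming hypothetical transistor counts rather than the paper's Intel dataset: a straight line in log2 space corresponds to Moore's Law's constant doubling period, while a quadratic term can capture growth at a decreasing rate.

```python
import numpy as np

# Hypothetical (year, transistor count) pairs in the spirit of the Intel
# series the paper analyzes; they are placeholders, not the paper's data.
years = np.array([1971, 1978, 1985, 1993, 2000, 2010])
transistors = np.array([2.3e3, 2.9e4, 2.75e5, 3.1e6, 4.2e7, 1.17e9])

t = years - years[0]
log_n = np.log2(transistors)

# Moore's Law: log2(count) linear in time, i.e., a constant doubling period
slope, intercept = np.polyfit(t, log_n, 1)
print(f"doubling period ~ {1 / slope:.2f} years")

# Alternative: a quadratic term in log2(count); a negative leading
# coefficient indicates growth that is increasing at a decreasing rate
a, b, c = np.polyfit(t, log_n, 2)
print(f"quadratic coefficient: {a:.4f}")
```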
As part of an effort to keep an MIS curriculum in line with market demand, a sample of job postings from Monster.com was taken over a one-year period. A list of the most frequently requested skills and knowledge was created for jobs requiring a bachelor’s degree in information systems (IS) or management information systems (MIS). The results support the importance of verbal and written communications. The skills lists were further broken down into four traditional career paths: database, networking, systems analysis, and programming. A cluster analysis of the data revealed three basic skill groups: analysis, programming, and networking.
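A minimal sketch of a cluster analysis of skill frequencies, using k-means as one possible method on hypothetical counts (the paper does not specify its clustering algorithm or data layout):

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical skill-by-career-path frequency matrix (rows = skills,
# columns = database, networking, systems analysis, programming postings).
skills = ["SQL", "Oracle", "TCP/IP", "Cisco", "UML", "requirements", "Java", "C++"]
counts = np.array([
    [40,  5, 10,  8],
    [35,  4,  8,  6],
    [ 5, 45,  6,  4],
    [ 3, 40,  2,  2],
    [ 6,  3, 38, 10],
    [ 8,  4, 42, 12],
    [ 9,  5, 12, 44],
    [ 4,  3,  8, 39],
])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(counts)
for cluster in range(3):
    members = [s for s, lab in zip(skills, labels) if lab == cluster]
    print(f"cluster {cluster}: {members}")
```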
Figure 3 in this Journal of Medical Entomology article is central to the authors’ warning about an exploding white-tailed deer population but conflicts in important aspects with the relevant deer research. Among other problems, it shows a 60% increase in the white-tailed deer density from 1500 to 2020 when the research consensus is that the population is about the same. It shows an exploding population from 2000 to 2020 without supporting data when the population peaked around the year 2000 according to evidence-based research.
Using freely available internet search tools for environmental scanning, information related to deer management was collected, categorized, and evaluated with the goal of providing public decision support. Key issues raised in the public debate and discovered by the search are addressed with relevant information formatted as dashboard elements of a decision support system. A graph addresses contradictory reports about the current direction of the deer population; the trend since 2006 appears to be down. Another graph illustrates the approximate long-term population trend; the current U.S. white-tailed deer population is about the same as in 1500. A table summarizes profiles of state deer issues and strategies. Only eleven states are trying to reduce their deer population. A graph illustrates the rise and fall of the California population, the most dramatic population decline in the U.S. over the past 100 years. Hunting pressure and herd demographic management are found to be related to the decline, making these candidate variables for attention in the decision support system. This case application is designed to illustrate methods the author has learned in creating a variety of decision support applications for technology companies.
Large wildfires have been a recent focus of public concern in California and other western states. To provide public access to relevant information, a website knowledge base was developed using the new Google Sites tool. Information collection and data analysis were based on an ongoing internet search of the issues in the public discussion. Data analysis includes statistical tests of some common factors proposed in the public discussion related to climate change and forest density. Findings include that data starting from 1932 show annual acres burned in Cal Fire jurisdictions have been about constant. Data from 1987 show that total acres burned increased and were correlated with increased maximum temperature, and that wildfires have become larger but less frequent. A decline in logging activity was strongly correlated with increased fire size and reduced deer populations. Drought was also correlated with increased fire size and fewer deer. A survey of students indicates that the public has conflicting perceptions about forest density. Many more reported having received information that reduced logging and increased forest density will reduce wildfire risk, contrary to what the data and public information indicate: that reduced logging has increased forest density and the risk of large wildfires.
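A minimal sketch of the kind of correlation test described, assuming hypothetical annual series for acres burned and maximum temperature (not the Cal Fire data used in the paper):

```python
import numpy as np
from scipy import stats

# Hypothetical placeholder series: total acres burned (thousands) and average
# annual maximum temperature (degrees F) for a run of years since 1987.
acres_burned = np.array([300, 280, 450, 400, 520, 610, 580, 700, 760, 900])
max_temp_f   = np.array([74.1, 73.8, 74.6, 74.4, 75.0, 75.3, 75.1, 75.8, 76.0, 76.5])

r, p = stats.pearsonr(max_temp_f, acres_burned)
print(f"correlation r = {r:.2f}, p-value = {p:.3g}")
```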
Rapid acceptance and development of social network applications provide opportunities to gather data at low costs and to allow for coordinated use of this information for management. The phrase "social management networks" is proposed to focus study on social networks used for management objectives. This paper examines frequency of postings on Facebook groups related to wildlife management in order to see how this application is being developed to support these social management networks. Also, a survey of college business students is analyzed to see how social networking applications might be used to support environmental scanning using surveys. The results suggest that many social management networks have only modest or little success, but there are some significant successes. Postings in Facebook groups tend to be linearly related to group size, except when very large groups are also considered. Online surveys significantly increase participation compared to mailed surveys as measured by intended participation rates. Adding Facebook, MySpace, or Twitter support can marginally improve participation rates for an online survey.
A web-based decision support site for public management of deer was created from an intensive daily internet search and from targeted searches. Relying on news and other sources, information has been organized based on key decision issues. This information has also been used to build a decision support simulation using detailed demographic and other data related to deer population management. Information acquisition issues and an application of the simulation are illustrated using a case study in San Jose, California, where a deer sterilization project has reduced the population below a desired target and appears destined to result in eradication of the local deer. Trail cameras were used to gather site specific information. Previous simulation approaches lacked demographic detail and were based on very narrow geographic samples, resulting in unreliable predictions for San Jose. The simulation model presented here is being tested against a very large geographic sample of cases. An anomalous case reported at Cornell, New York, is contradicted by other results and may be a result of data issues.
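A minimal sketch of an age-structured, female-only projection with a sterilization rate, illustrating the kind of demographic detail described; the age-class structure and all parameters are hypothetical placeholders, not the simulation used in the study:

```python
import numpy as np

def project_herd(years=10, fawns_per_doe=1.4, sterilized_fraction=0.6,
                 survival=(0.55, 0.80, 0.85)):
    """Female-only projection with three age classes: fawn, yearling, adult.
    All parameters are hypothetical placeholders, not the study's estimates."""
    herd = np.array([20.0, 15.0, 40.0])  # initial females per age class
    totals = [herd.sum()]
    for _ in range(years):
        breeding_does = (herd[1] + herd[2]) * (1 - sterilized_fraction)
        fawn_females = 0.5 * fawns_per_doe * breeding_does  # half of fawns are female
        herd = np.array([
            fawn_females,
            herd[0] * survival[0],                          # fawns -> yearlings
            herd[1] * survival[1] + herd[2] * survival[2],  # yearlings and adults -> adults
        ])
        totals.append(herd.sum())
    return totals

print([round(t) for t in project_herd()])  # totals decline under heavy sterilization
```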
As new computer hardware becomes available offering better performance at a lower price, computer accessibility rapidly improves, resulting in dramatic changes to society. Planners in business and other organizations need an estimate of future prices and performance to help design their systems or to anticipate the effect of these changes. This paper presents a new set of historical annual data from 1987 to 2010 defining basic price to performance measurements for computer components including processors, hard drives, random access memory, and network interface cards. Two approaches to extrapolating price to performance are evaluated: the industry learning curve and a constant rate of increase implied by Moore’s Law. Regression analysis of this new dataset shows that long-term, stable improvements in price to performance consistent with Moore’s Law provide a very good fit to the historical data and a better approach to extrapolating future price to performance than a learning curve approach. Pr...
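A minimal sketch of the two extrapolation approaches compared, fit to hypothetical placeholder data rather than the paper's dataset: a constant annual rate of decline (Moore's Law style) versus a learning curve in which cost falls with cumulative volume.

```python
import numpy as np

# Hypothetical placeholders: year, cumulative units shipped (millions), and
# price per unit of performance (e.g., dollars per unit). Not the paper's data.
year       = np.array([1987, 1991, 1995, 1999, 2003, 2007, 2010])
cum_units  = np.array([5, 20, 60, 150, 340, 700, 1200])
price_perf = np.array([1000, 320, 100, 30, 9, 2.7, 1.0])

# Constant-rate (Moore's Law style): log price/performance linear in time
slope_t, _ = np.polyfit(year - year[0], np.log(price_perf), 1)
print(f"constant-rate model: {1 - np.exp(slope_t):.1%} decline per year")

# Learning curve: log price/performance linear in log cumulative units
slope_v, _ = np.polyfit(np.log(cum_units), np.log(price_perf), 1)
print(f"learning curve: {1 - 2 ** slope_v:.1%} decline per doubling of cumulative units")
```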
As a network technology, ethernet flourished in low-cost, low-end markets. Because it was simple to make and based on open standards, many companies created products. The resulting improvement in price, performance, and market acceptance resulted in ethernet replacing the more established and sophisticated token-ring technology that dominated early large corporate LANs. As ethernet gets faster, accelerating from the original 10 Mbps into Gigabit speeds, the technology is poised to challenge the dominant backbone and WAN standard, ATM. A discussion of new ethernet developments is formalized with a decision model used to define a market boundary, with data illustrating why and where a technology may dominate.
While the cost of storage devices such as hard disk drives continues to fall, the overall proportion of computer network costs dedicated to storage continues to rise. Within a few years, storage will account for 50 percent of total network hardware and software costs. Since computer networks typically have long lives, the design process often involves projecting costs many years into the future. This paper examines a rule based on a technology trend that can be used to estimate the cost per megabyte of hard disk drives. The rule is similar to the well-known Moore’s Law that has reliably summarized integrated circuit advances over the past several decades. A statistical analysis of historic data suggests the rule for hard disk drives captures much of the readily available information.
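A minimal sketch of how such a rule might be applied in network cost planning, with a hypothetical current cost and decline rate rather than the paper's estimates:

```python
def projected_cost_per_mb(current_cost, annual_decline, years_ahead):
    """Project hard-drive cost per megabyte under a constant annual rate of
    decline, a Moore's-Law-style rule. Inputs are hypothetical placeholders."""
    return current_cost * (1 - annual_decline) ** years_ahead

# e.g., $0.10/MB today, an assumed 35% decline per year, planning 5 years out
print(f"${projected_cost_per_mb(0.10, 0.35, 5):.4f} per MB in 5 years")
```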
As an experiment investigating social media as a data source for making management decisions, photo sharing websites were searched for data on deer sightings. Data about deer density and location are important factors in decisions related to herd management and transportation safety, but such data are often limited or not available. Results indicate that when combined with simple rules, data from photo sharing websites reliably predicted the location of road segments with high risk for deer-vehicle collisions as reported by volunteers to an internet site tracking roadkill. Use of Google Maps as the GIS platform was helpful in plotting and sharing data, measuring road segments and other distances, and overlaying geographical data. The ability to view satellite images and panoramic street views proved to be particularly useful. As a general conclusion, the two independently collected sets of data from social media provided consistent information, suggesting investigative value to this data source. Overlaying two independently collected data sets can be a useful step in evaluating or mitigating reporting bias and human error in data taken from social media.
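A minimal sketch of the kind of simple rule described, assuming hypothetical geotagged photo coordinates and road-segment midpoints, and a distance threshold chosen arbitrarily for illustration:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(a))

# Hypothetical inputs: geotagged deer photos and road-segment midpoints.
photos = [(37.33, -121.90), (37.34, -121.91), (37.20, -121.80)]
segments = {"Segment A": (37.335, -121.905), "Segment B": (37.10, -121.70)}

# Simple rule: flag a segment as high risk if several photos fall within 2 km.
for name, (lat, lon) in segments.items():
    nearby = sum(haversine_km(lat, lon, p_lat, p_lon) <= 2.0 for p_lat, p_lon in photos)
    print(f"{name}: {nearby} nearby sightings -> {'HIGH RISK' if nearby >= 2 else 'low risk'}")
```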
An ongoing project to investigate the use of the internet as an information source for decision support identified the decline of the California deer population as a significant issue. Using Google Alerts, an automated keyword search tool, text and numerical data were collected from a daily internet search and categorized by region and topic to allow for identification of information trends. This simple data mining approach determined that California is one of only four states that do not currently report total, finalized deer harvest (kill) data online and that it is the only state that has reduced the amount of information made available over the internet in recent years. Contradictory information identified by the internet data mining prompted the analysis described in this paper indicating that the graphical information presented on the California Fish and Wildlife website significantly understates the severity of the deer population decline over the past 50 years. This paper pr...