Learning analytics. Predictions? Understanding?

The following post is airing something that I am thinking about with regard to my PhD. It is some early thinking, so apologies in advance if it turns out to be nonsense.

Something that has concerned me for a while with regard to learning analytics is the use of terms like ‘prediction’ and ‘understanding’. For example:

“Research in learning analytics and its closely related field of education data mining, has demonstrated much potential for understanding and optimizing the learning process”
(Siemens & Baker, 2012)

The Society for Learning Analytics Research defines Learning Analytics as: “… the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs”
(Siemens & Baker, 2012).

“… learning analytics (LA) offers the capacity to investigate the rising tide of learner data with the goal of understanding the activities and behaviors associated with effective learning”
(Macfadyen, Dawson, Pardo, & Gasevic, 2014)

“The intention is to develop models, algorithms, and processes that can be widely used. Transferability is a key factor here; analytic and predictive models need to be reliable and valid at a scale beyond the individual course or cohort”
(Ferguson et al., 2014)

“Learning analytics can penetrate the fog of uncertainty around how to allocate resources, develop competitive advantages, and most important, improve the quality and value of the learning experience” (Siemens & Long, 2011)

“Learning analytics is a technology on the rise in higher education. Adapted from business analytics, this approach is being used to track and predict student engagement and success in higher education.” (Lodge, 2011)

“There is evidence to suggest that learning analytics can be successfully used to predict students at risk of failing or withdrawing and allow for the provision of just in time intervention.”
(Lodge, 2011)

“While the use of learning analytics to track and predict student success in higher education is rapidly becoming mainstream practice in higher education institutions, it is predominantly being used to predict and prevent student attrition.”
(Lodge & Lewis, 2012)

“Sophisticated systems might even recommend learning activities, predict a student’s success or give advice”
(Dyckhoff, Lukarov, Muslim, Chatti, & Schroeder, 2013)

“… much of the early work has focused on the statistical prediction of outcomes, chiefly grades or retention, by relating these target variables to predictor variables harvested from students’ demographic and institutional variables, and their interaction with the LMS”
(Rogers, 2015)

These quotes are from a two-minute search through my EndNote library looking for terms like ‘prediction’ or ‘understanding’. There are a lot more but you get the point. The trouble is that I am becoming increasingly skeptical about the ability of learning analytics to contribute to prediction or even understanding. The following is an attempt to explain this skepticism based on my thinking, where learning analytics is data or information arising from interactions occurring within and between complex adaptive systems (Beer, Jones, & Clark, 2012).

Broadly speaking, there are assumptions inherent in terms like ‘prediction’ and ‘understanding’. They both, to some extent, assume that certainty and full knowledge can be attained, and that the agents and systems involved are fixed and will not evolve (Allen & Boulton, 2011). Likewise, prediction is based on a snapshot in time and cannot capture the impact of interactions between the agents and systems after the snapshot is taken. The snapshot is essentially only a view of something that is in transition (Allen & Boulton, 2011). There are also assumptions related to the stability, immovability or changelessness of:

  • The initial system’s starting state
  • The mechanisms that link the variables
  • The internal responses inside each agent or element
  • The system’s environment

The only way that prediction becomes possible is when the system or agent is isolated from external influences; something that can only ever occur in laboratory conditions (Allen & Boulton, 2011). Uncertainty is always present when we are talking about complex systems. The only way we can banish uncertainty is by making assumptions; assumptions I am not sure can hold when we are talking about systems with many agents interacting in non-linear ways. For learning analytics, the power of prediction implicit in deterministic models can only be realised if the assumptions made in their creation are true (Allen & Boulton, 2011). My sense is that there are many people looking closely at the predictive potential of learning analytics, myself included. I am beginning to question why, especially when prediction, control and complete understanding are always an illusion, except in exceptional, controlled, closed and fixed situations (Allen & Boulton, 2011).
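As a toy illustration of why those snapshot assumptions matter (my own example, not drawn from any of the papers cited above), consider the logistic map, about the simplest non-linear system there is. Two starting states that differ by one part in a million diverge completely within a few dozen update steps, so a prediction that was accurate at the snapshot says very little about the system shortly afterwards:

```python
# Toy illustration only: the logistic map, a one-line non-linear system.
# Two trajectories start 0.000001 apart; after a few dozen iterations they
# bear no resemblance to each other, so a forecast made from the initial
# snapshot quickly becomes worthless.
def logistic_map(x, r=3.9):
    """One update step of the logistic map with growth parameter r."""
    return r * x * (1.0 - x)

a, b = 0.500000, 0.500001   # two almost identical starting states
for step in range(1, 41):
    a, b = logistic_map(a), logistic_map(b)
    if step % 10 == 0:
        print(f"step {step:2d}: a={a:.6f}  b={b:.6f}  gap={abs(a - b):.6f}")
```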

I am speculating, but I wonder how much of the predictive potential we perceive in learning analytics exists because we have constrained the system. For example, most universities have a single learning management system, a single student information system, some form of homogenized approach to course design and delivery, and so on and so forth. Have we suppressed the evolutionary potential to such an extent that we have created an environment that has made prediction and understanding possible?

My quick scan of the EndNote library also revealed a couple of articles that fit with some of the things I have mentioned here. The following is from Doug Clow:

“Learning analytics is widely seen as entailing a feedback loop, where ‘actionable intelligence’ is produced from data about learners and their contexts, and interventions are made with the aim of improving learning”
(Clow, 2014)

I intend to further explore this paper for a couple of reasons. It seems to align nicely with my thinking around the applicability of situation awareness, and there also appears to be an, albeit limited, attempt at distributed cognition with regard to the operationalization of learning analytics. And the following from Jason Lodge has some real gems that I need to look at more closely than I have to date:

“LA is unable to elucidate the student approach to learning, relationships between apparent levels of engagement online and overall student experiences, and is therefore limited as a measure of the process and pathways students may undertake to complete their learning, let alone for higher cognitive processes or ways of being” (Lodge & Lewis, 2012)

References

Allen, P., & Boulton, J. (2011). Complexity and limits to knowledge: The importance of uncertainty. The SAGE handbook of complexity and management, 164-181.

Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at ASCILITE 2012: Future challenges, sustainable futures, Wellington, New Zealand.

Clow, D. (2014). Data wranglers: human interpreters to help close the feedback loop. Paper presented at the Proceedings of the Fourth International Conference on Learning Analytics And Knowledge.

Dyckhoff, A. L., Lukarov, V., Muslim, A., Chatti, M. A., & Schroeder, U. (2013). Supporting action research with learning analytics. Paper presented at the Proceedings of the Third International Conference on Learning Analytics and Knowledge.

Ferguson, R., Clow, D., Macfadyen, L., Essa, A., Dawson, S., & Alexander, S. (2014). Setting learning analytics in context: overcoming the barriers to large-scale adoption. Paper presented at the Proceedings of the Fourth International Conference on Learning Analytics And Knowledge.

Lodge, J. (2011). What if student attrition was treated like an illness? An epidemiological model for learning analytics. Paper presented at the ASCILITE – Australian Society for Computers in Learning in Tertiary Education Annual Conference 2011. http://www.editlib.org/p/43627

Lodge, J., & Lewis, M. (2012). Pigeon pecks and mouse clicks: Putting the learning back into learning analytics. Paper presented at ASCILITE 2012, Wellington, New Zealand.

Macfadyen, L. P., Dawson, S., Pardo, A., & Gasevic, D. (2014). Big Data in Complex Educational Systems: The Learning Analytics Imperative and the Policy Challenge. Research & Practice in Assessment, 9(Winter, 2014), 11. Retrieved from http://go.galegroup.com/ps/i.do?action=interpret&v=2.1&u=cqu&it=JIourl&issn=2161-4210&authCount=1&p=AONE&sw=w&selfRedirect=true

Rogers, T. (2015). Critical realism and learning analytics research: epistemological implications of an ontological foundation. Paper presented at the Proceedings of the Fifth International Conference on Learning Analytics And Knowledge, Poughkeepsie, New York.

Siemens, G., & Baker, R. S. J. d. (2012). Learning analytics and educational data mining: towards communication and collaboration. Paper presented at the Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver, British Columbia, Canada. http://delivery.acm.org/10.1145/2340000/2330661/p252-siemens.pdf?ip=138.77.2.133&acc=ACTIVE SERVICE&CFID=145100934&CFTOKEN=10069569&__acm__=1345679404_18f65a315d7b4ba9014a8f150ad6189c

Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. Educause Review, 46(5), 9. Retrieved from http://www.educause.edu/ero/article/penetrating-fog-analytics-learning-and-education

 


A blast from the past

This post winds back the clock a few years to take another look at a correlation we noticed some time back; a correlation that is somewhat curious.

Ever since David and I started the Indicators project way-back-when, we have considered student clicks within the learning management system (LMS) as an indicator (and only an indicator) of student behavior. This was based on correlations between student clicks on the LMS and their resulting grade as per the following chart (updated recently).

[Chart: student clicks within the LMS against final grade]

We have no way of knowing what a click means, as I suspect some students are like me in that they randomly click around the place (as a way of procrastinating, in my case). There's a good 2012 paper by Jason Lodge and Melinda Lewis that talks about this.

We know that clicks are, to a large extent, meaningless, but they are one of the few unobtrusive indicators we can easily extract from the LMS. On average, more clicks within the LMS == better grades. Yes, I know! This is not universally true; it's just an average, and our own 2012 paper suggested some reasons why.

One thing we did notice back then was not so much the quantity of clicks that the students were making, but the variety of different content items that they were clicking on. What happens if we consider a click by a student on a particular activity or resource as a connection and disregard how many times they click on that particular activity or resource?

So, using the same dataset (n=34930) as the previous chart, the following chart shows the average number of connections for each student grade.

[Chart: average number of distinct content-item connections for each student grade]

To me, this looks very similar to the trend from the previous chart that showed clicks against grade. However, what I did find very interesting was the average number of clicks that each grade group made on each individual content item.

Grade group   Average clicks per course element
HD            5.6
D             5.4
C             5.6
P             5.7
F             4.8

I found this interesting because of the apparent lack of variation between the different grades. Broadly speaking, each grade group makes a roughly similar number of clicks on each activity and resource within Moodle. However, the higher achieving students click on a larger proportion of the course activities and resources; they don't necessarily click more on each individual element. I guess I'm not surprised: if we take a network perspective, the higher achieving students have a greater number of nodes in their network than the lower achieving students, something the social network analysis (SNA) folk have known for some time.
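For what it's worth, here is a minimal sketch of how the three measures above could be computed from a raw click log. The table layout and column names (student_id, grade, element_id, one row per click) are assumptions for illustration, not the actual Indicators project schema:

```python
import pandas as pd

# Hypothetical click log: one row per click a student makes on a course element.
# Assumed columns: student_id, grade (HD/D/C/P/F), element_id.
clicks = pd.read_csv("lms_clicks.csv")

# Per-student measures: total clicks, and distinct elements clicked ("connections").
per_student = clicks.groupby(["student_id", "grade"]).agg(
    total_clicks=("element_id", "size"),
    connections=("element_id", "nunique"),
).reset_index()

# Average clicks per course element: how often a student clicks each element
# they touch at least once.
per_student["clicks_per_element"] = per_student["total_clicks"] / per_student["connections"]

# Average the three measures within each grade group, in grade order.
summary = (
    per_student
    .groupby("grade")[["total_clicks", "connections", "clicks_per_element"]]
    .mean()
    .reindex(["HD", "D", "C", "P", "F"])
    .round(1)
)
print(summary)
```

If the pattern described above holds, total_clicks and connections should both fall away towards the failing grades while clicks_per_element stays roughly flat.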

 

 

More learning analytics ponderings

There are many publications and presentations espousing the potential of learning analytics to contribute to improved decision-making in education. For example:

“Basing decisions on data and evidence seems stunningly obvious, and indeed, research indicates that data-driven decision-making improves organizational output and productivity.”
(Siemens and Long 2011)

This quote was based on very interesting research that showed improved performance across a bunch of publicly listed companies that better utilized data-driven decision-making (Brynjolfsson, Hitt, and Kim 2011). I like the Siemens quote and the research it was based on for two reasons, one good and one bad. Firstly, I work in a regional Australian university and I know that higher education can do a lot better with regard to making decisions based on evidence. Secondly, the Siemens quote appeals to me because (a) I have great respect for his opinion and (b) there is a part of me that believes that better data and information will lead to better decisions.

Note that I used the word ‘believes’ deliberately in the previous sentence. Upon reflection, this is probably more of a faith thing than a scientific reality. The trouble is that better, and particularly more detailed, data does not necessarily lead to better sense-making and/or decision-making (Aaltonen 2008). Human beings are just not wired that way.

“We are a bricolage of cognition, emotion, intuition, information consumption, doubt and belief”
(Siemens 2006)

This highlights the importance of including human beings in the learning analytics cycle, something that Doug Clow has previously noted:

“Previous work in the literature has emphasised the need for and value of human meaning-making in the process of interpretation of data to transform it into actionable intelligence.”
(Clow 2014)

So even though we know that humans are essential in the information/learning analytics cycle, we also know that humans are bad at making ‘rational’ decisions based on data. We filter our observations of the world through our cognitive frameworks. These frameworks are individual to each and every one of us and include things such as experience, intuition and instinct. Add to this that learning analytics is closely coupled with IT, where the main considerations are likely to be precision, rigor and reproducibility rather than the human consumers of the information (Norman 1993).

“The logic behind many investments in IT tools and big data initiatives is that giving managers more high-quality information more rapidly will improve their decisions and help them solve problems and gain valuable insights. That is a fallacy.”
(Marchand and Peppard 2013)

This is a long way of saying that I'm wondering whether we need to spend more time on the recipients of the information rather than on the data side of things.

 References

Aaltonen, Mika. 2008. “Multi-ontology, sense-making and the emergence of the future.” Futures 41:279-283. doi: 10.1016/j.futures.2008.11.017.

Brynjolfsson, Erik, Lorin M Hitt, and Heekyung Hellen Kim. 2011. “Strength in numbers: How does data-driven decisionmaking affect firm performance?” Available at SSRN 1819486.

Clow, Doug. 2014. “Data wranglers: human interpreters to help close the feedback loop.” Proceedings of the Fourth International Conference on Learning Analytics And Knowledge.

Marchand, Donald A., and Joe Peppard. 2013. “Why IT Fumbles Analytics.” Harvard Business Review 91 (1):104-112.

Norman, Donald A. 1993. Things that make us smart: Defending human attributes in the age of the machine: Basic Books.

Siemens, George. 2006. Knowing knowledge: Lulu.com.

Siemens, George, and Phillip Long. 2011. “Penetrating the Fog: Analytics in Learning and Education.” EDUCAUSE Review 46 (5):9.

 

A possible indicator of faddism?

I have been preparing a document for my PhD supervisors that makes mention of a paper we wrote last year. The paper, titled “Three paths for learning analytics and beyond: Moving from Rhetoric to Reality”, talks about the dangers associated with learning analytics and management fads, and how the hype around technological concepts can swamp deliberate and mindful adoption and implementation.

While looking at this paper I conducted a quick search on Google Scholar, year by year, using the search term “learning analytics”. While it's not particularly scientific, the trend is interesting nonetheless.

[Chart: Google Scholar results for “learning analytics”, year by year]

Peer review of a colleague's assessment item

This post is a quick peer review of a design-based research proposal a colleague has developed for a unit in their master's degree, where they are required to demonstrate peer review. Good luck Rebecca!

The proposal is centered on a short course, five weeks in duration, that is offered to high school students to provide them with some insight into tertiary education. The course uses Conley's model of college readiness to guide what is taught, which includes the following facets of readiness:

  • Key cognitive strategies
  • Academic knowledge and skills
  • Academic behaviours
  • Contextual skills and awareness

It is clear that there are some issues associated with the short course as it stands now. Some of these issues resonated with me as they are not limited to just this short course. For example, one of the problems mentioned is linked with the dominant online course delivery mechanism in higher education, the learning management system (LMS). Rebecca points out that LMS-delivered courses have transactional distance, are instructor-led and have to be completed in an allocated timeframe. According to the proposal's introduction, the style of teaching and learning afforded by the LMS is not constructivist, connectivist or conducive to learner autonomy and critical thinking; all sentiments I agree with to some extent. However, given the dominance of the LMS as the way that eLearning is delivered in higher education (Coates, James, & Baldwin, 2005), and given that this is a preparatory course for high school students, it seems appropriate that future students gain some experience with this medium, warts and all.

The proposal mentions moving towards a “more heutagogical approach”, I assume to offset some of the limitations associated with LMS-based eLearning. I'm no expert in heutagogy, but I must admit that the idea of self-determined learning is very attractive in comparison to the current approaches to eLearning. The proposal also considers the student cohort, many of whom live in rural or remote areas and come from low socio-economic backgrounds. This can correspond to limited access to technology, such as reliable broadband internet connections.

The proposal describes three research questions:

  1. Does restructuring the Preparation for Success in Health course to include a heutagogical approach allow students to collaborate, critically reflect and provide feedback in an open online environment?

  2. Does shifting the knowledge acquisition into the students’ hands mean they will access a wider variety of sources of information, including health professionals to answer their questions and build on their own ideas of what appropriate knowledge is?

  3. Will the students engage in the Preparation for Success in Health course more authentically if allowed to be more self-directed in their approach to learning, thus engaging in deeper cognitive learning?

The proposal’s literature review hinges upon student-centred learning and suggests that heutagogy might be an appropriate framework for digital-age learning. In a heutagogical approach, the learners are highly autonomous and the focus is on building the learner’s capacity to learn. Experiential and reflective learning is preferred over ‘transmissive’ approaches. On the surface at least, heutagogy seems to have a lot in common with the personal learning environment literature from a while back. Even the graduate/generic attributes folk talk about some of this, albeit from a different perspective, as do the problem-based learning folk. The double-loop learning approach described in the proposal interests me as it links nicely to my PhD around complex adaptive systems. Non-linear learning, to me, is a better match for how people really learn from an anthropological perspective, yet the dominant socio-technical approach is very linear (IMHO).

Some feedback on the proposal:

Very interesting and worthwhile proposal. I’ll be disappointed if it doesn’t happen. Have you considered an internal SOLT grant?

  • The chosen methodology is design-based research (DBR). This needs to be unpacked more comprehensively. It is not clear to me how the implementation plan links with the methodology. DBR does seem to be driving the methodology behind the implementation plan, but it's not explicit.
  • I'm not sure about the three research questions. Together they describe a very broad scope that includes a huge body of literature. My advice would be to narrow the scope somewhat. The first research question is great and, in my mind, enough.

More specifically:

  • Conley’s model needs to be unpacked in the introduction.
  • Heutagogy appears in the introduction without being introduced. I would suggest describing the problem in the introduction without heutagogy, and then introducing it as an alternative framework in the body of the proposal.
  • The proposal could benefit from an abstract/paragraph/exec summary to help set the scene for the introduction section.
  • The proposal touches on constructivism, connectivism and technology. It might be just my pattern-entrainment, but this could be a nice way to articulate how this proposal is different and is challenging the status quo. David has unpacked some of the technological issues associated with the LMS here.

All in all, a very interesting and worthwhile proposal. Well done and good luck.

References

Coates, H., James, R., & Baldwin, G. (2005). A critical examination of the effects of learning management systems on university teaching and learning. Tertiary education and management, 11(2005), 19-36.

The ‘wickedness’ of student attrition and retention

Earlier this year Dr Celeste Lawson and I wrote a couple of papers (still in review) about student attrition in Australian higher education. In the first paper we looked at the actual nature of the student attrition problem, while in the second we looked at the approaches that universities took in their attempts to address it. This post focuses on the first paper, in which we questioned the way that universities conceptualized their student attrition problems. Note: I should add that referring to student attrition as a problem is probably wrong. It creates a negative impression of the situation and also implies that it has to be solved. This is perhaps the wrong way to think about student retention and attrition.

“Reasons for student non-completion are complex”

(Maher & Macallister, 2013)

We analyzed a survey of students who started, but failed to finish, their degrees. The results were pretty much what folk in higher education have come to expect. Students leave for reasons such as work commitments, family commitments, financial problems, personal problems, health problems and so on; the usual array of reasons found throughout the student attrition literature.

If we consider student attrition as a problem within a linear (causal) system (which, as a sector, we tend to do), these factors can be addressed systematically within the organizational hierarchy. For example, many students mentioned work commitments as a significant factor in their decision to leave the university. The typical university response would be to perhaps develop an instructional time-management module for new students; or allocate a learning support person who can help students with their study load; or provide a service whereby students could receive advice on how to better balance their work-study life. All of these would be valid responses if the problem were single-dimensional.

We conducted a content analysis on the free-text comments that students made within these surveys and looked closely at the factors that led to attrition. We found that it was the accumulation of factors that led to students dropping out, not single reasons. The complex interplay between a range of factors and the students' context ends in their premature departure from university. The following diagram from our paper attempts to visualize this by showing relationships between attrition factors. Note that the strength of the line between the factors indicates the frequency with which the factors appeared together:

[Figure: network of attrition factors, where line strength indicates how frequently the factors appeared together]

What is not shown here (and is perhaps an avenue for future research) is that a similar diagram, showing weighted interactions between contributing factors, could be developed for individual students. So from a university perspective we have not a single issue, or even a series of issues, to address, but a complex network of context-dependent issues, many of which are beyond our ability to address, or even perceive. Add to this that even the small subset of contributing factors shown above has dependencies at multiple levels. For example, a student might identify as struggling with a financial situation that could have been externally triggered at a local, regional or national level.
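As a rough sketch of how the weighting behind that sort of diagram can be produced: assume each student's free-text comment has already been coded into the set of attrition factors it mentions (the factor labels and data below are made up for illustration). Counting how often pairs of factors appear together in the same response gives the edge weights for the network:

```python
from collections import Counter
from itertools import combinations

# Hypothetical coded survey responses: each student's comment reduced to the
# set of attrition factors it mentions (labels invented for illustration).
coded_responses = [
    {"work commitments", "family commitments", "financial problems"},
    {"work commitments", "health problems"},
    {"family commitments", "financial problems", "health problems"},
    {"work commitments", "financial problems"},
]

# Count how often each pair of factors co-occurs within a single response.
# These pair counts become the edge weights (line strengths) in the diagram.
edge_weights = Counter()
for factors in coded_responses:
    for pair in combinations(sorted(factors), 2):
        edge_weights[pair] += 1

for (factor_a, factor_b), weight in edge_weights.most_common():
    print(f"{factor_a} -- {factor_b}: {weight}")
```

A per-student version of the diagram would presumably apply the same counting to a single student's data rather than to the whole cohort.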

It appears we have a complex web of inter-related and temporal factors that can contribute to a student withdrawing from their studies. We describe this in the paper as a wicked problem, which I have mentioned before. Wicked problems are difficult to define, have many interdependencies, are multi-causal, unstable and socially complex (Briggs, 2007). Importantly, traditional bureaucracies with their vertical silos are unable to tackle wicked problems that are ambiguous and lack clarity. Traditional bureaucracies are also risk averse, which can inhibit the innovation, experimentation or bricolage needed to address wicked problems (Briggs, 2007).

“It’s the social complexity of wicked problems as much as their technical difficulties that make them tough to manage”
(Camillus, 2008)

Where to from here is the million-dollar question, although there are some ideas in the literature about tackling wicked problems that are worth exploring. Two in particular grabbed my attention given my interest in complex adaptive systems:

  • Involve stakeholders, document opinions and communicate, especially horizontally. This appears to align with the complexity thinking around ongoing ethnographic collection and growing the network conduits between agents. There are also some links to the self-assertive and integrative paper that David has mentioned.
  • Focus on action. Something I've been banging on about recently in regard to learning analytics. Detailed planning and analysis are of little use in complex systems, or in this case with wicked problems, as future system states cannot be predicted due to unknowable effects stemming from interaction between agents. Take a number of small-scale actions, monitor for emergence, and repeat, as opposed to upfront planning and analysis followed by a single course of action. ‘Safe-fail probes’ is the term that Snowden uses, and it makes a lot of sense (no pun intended).

It is safe to say that there are no silver bullets when it comes to student attrition. However, I believe there is scope to start thinking about and tackling attrition differently. Attrition is a complex multi-causal issue that the sector continues to try and address using SET mindsets and methods. I’m saying we need to think about it differently, and perhaps engage in some BAD practices.

References

Briggs, L. (2007). Tackling wicked problems: A public policy perspective. Canberra: Australian Government, Commonwealth of Australia.

Camillus, J. C. (2008). Strategy as a Wicked Problem. Harvard Business Review, 86(5), 98-106. Retrieved from http://ezproxy.cqu.edu.au/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=31730150&site=eds-live&scope=site

Maher, M., & Macallister, H. (2013). Retention and attrition of students in higher education: Challenges in modern times to what works. Higher Education Studies, 3(2), p62.

Is learning analytics hamstrung from the outset?

Over the last six months I've been writing about student attrition and retention with a colleague from work. We've submitted a couple of journal articles, currently in review, about how universities continue to misinterpret the nature of their student attrition issue. To cut a long story short, we argue that attrition is a wicked problem, a problem in complexity. Climate change, deforestation and geopolitical conflicts are examples of wicked problems where there are no single solutions or even any obvious paths towards solution. Conceptualizing student attrition as a wicked problem occurring within non-linear, complex systems changes how we approach these types of issues. As an aside, even classifying attrition as a problem (or issue) is probably the wrong way to think about it. It's more like a symptom of a network of problems, a network where we can't possibly know what or where most of the nodes are.

While writing these papers I saw some similarities between how universities are approaching student attrition and how they are approaching learning analytics adoption. In both cases they have mis-specified the nature of the organization. Their approaches are based on assumptions that the organisations are machine-like.

“Managers want workers to respond predictably to incentives and to accomplish goals defined by managers and to do this with little deviation from plans that management has developed to improve performance” (McDaniel, 2007)

The Machine

The machine-like model of organizations is associated with management approaches based on command, control and planning (McDaniel, 2007). This is a valid approach for managing in a linear, stable environment where future states can be anticipated. In fact, these approaches depend on the ability of managers and workers to forecast future system states (McDaniel, 2007). However, if we view organisations and the environments in which they operate as complex adaptive systems, machine-model management no longer works. It is simply not possible to predict future states when the systems are made up of agents that are information processors with the capacity to modify their behavior based on the information they receive (J. Holland, 2006; J. H. Holland, 1995). An important contrast between viewing an organization as a machine or as a complex adaptive system is the diversity of the agents within the system. Complex adaptive systems encourage diversity, whereas the machine model tends to favor agent homogenization (McDaniel, 2007).

“Participation of clinicians in hospital strategic decision making is more helpful in terms of bottom line performance than the participation of middle managers”
(Ashmos, Duchon, McDaniel Jr, & Huonker, 2002)

“If we want workers to be able to improve performance in the face of unknowability, we must invest in efforts to help them make sense of the world in a way that enables the organisation to take action and to learn about the world from the actions that are taken”
(McDaniel, 2007)

So we appear to have a misinterpretation of the actual nature of organisations and the environments in which they operate. Organisations are composed of information-processing agents that change and adapt with the information they receive. Add to this the rapid spread of hype around learning analytics, fueled by commercial entities and aimed at assisting decision-makers at many different levels of the academy. While I believe that learning analytics has enormous potential, I can't help wondering if the fundamentally misunderstood nature of organisations is going to be its greatest limiting factor, especially when we are talking about information/action cycles.

 References

Ashmos, D. P., Duchon, D., McDaniel Jr, R. R., & Huonker, J. W. (2002). What a mess! Participation as a simple managerial rule to ‘complexify’ organizations. Journal of Management Studies, 39(2), 189-206.

Holland, J. (2006). Studying Complex Adaptive Systems. Journal of Systems Science and Complexity, 19(1), 1-8. doi:10.1007/s11424-006-0001-z

Holland, J. H. (1995). Hidden order: How adaptation builds complexity. Reading, MA: Addison-Wesley.

McDaniel, R. R., Jr. (2007). Management Strategies for Complex Adaptive Systems: Sensemaking, Learning, and Improvisation. Performance Improvement Quarterly, 20(2), 21-41. Retrieved from http://onlinelibrary.wiley.com/store/10.1111/j.1937-8327.2007.tb00438.x/asset/j.1937-8327.2007.tb00438.x.pdf?v=1&t=h672itdt&s=a7c341c21237351ad995bdb34074c98db94bb026