METHODOLOGY
Year : 2020  |  Volume : 3  |  Issue : 3  |  Page : 93-100

Assessing equivalency during the implementation of a health profession's program to a distant campus: Development of a distant campus evaluation tool


Department of Physician Assistant Studies, Grand Valley State University, Grand Rapids, Michigan, USA

Date of Submission: 08-Jul-2020
Date of Acceptance: 07-Aug-2020
Date of Web Publication: 06-Nov-2020

Correspondence Address:
Dr. Theresa A Bacon-Baguley
301 Michigan Street NE, Grand Valley State University, Grand Rapids, Michigan 49503
USA

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/EHP.EHP_26_20

  Abstract 


Universities constantly look to technology-driven approaches for providing distance education. As technology evolves, an additional layer of evaluation is essential to ensure that the mode of delivery does not interfere with the delivery of content. Although guidelines are available for evaluating an academic program, no evaluation tools have been published to assess the equivalency of education when a program expands to a distant campus. This manuscript describes the development and utilization of a tool to assess equivalency in the delivery of a program to a distant campus that utilizes videoconferencing technology to deliver the curriculum. The tool was based on four major areas of potential concern when using technology to deliver the curriculum: (1) technology limitations, (2) confidence in the system, (3) faculty ability to deliver the content, and (4) resources available at each site. Data obtained from implementing the tool identified site-specific concerns, as well as concerns common to both sites.

Keywords: Assessment, distant education, physician assistant, videoconferencing


How to cite this article:
Bacon-Baguley TA, Reinhold MI. Assessing equivalency during the implementation of a health profession's program to a distant campus: Development of a distant campus evaluation tool. Educ Health Prof 2020;3:93-100

How to cite this URL:
Bacon-Baguley TA, Reinhold MI. Assessing equivalency during the implementation of a health profession's program to a distant campus: Development of a distant campus evaluation tool. Educ Health Prof [serial online] 2020 [cited 2020 Dec 4];3:93-100. Available from: https://www.ehpjournal.com/text.asp?2020/3/3/93/300075




  Introduction


It has long been recognized that health disparities exist in rural areas throughout the United States. Not only is there a lower availability of medical specialists in rural areas (30/10,000 individuals compared to 263 in urban areas), but there is also a shortage of primary-care providers (13.1/10,000 people compared to 31.2 in urban areas).[1] Health inequity data show that individuals in rural areas have a greater incidence of chronic diseases such as diabetes and coronary heart disease, a higher percentage enrolled in Medicaid, a greater percentage uninsured, and a lower life expectancy.[1],[2],[3] Socioeconomic factors in rural areas also contribute to health disparities. According to published socioeconomic data from the Rural Health Information Hub,[1] rural residents tend to be poorer, have greater transportation difficulty limiting access to health care, and have less access to internet-based health resources.[4] Educating health-care providers to address these disparities requires institutions of higher education to be creative in the development and/or expansion of professional programs to meet the needs within these areas.

One mechanism to address health disparities is to educate individuals who live in rural areas to provide medical care within the region where they reside. It has been established that training health-care providers in a rural or underserved community results in greater retention of those providers within those areas. Multiple studies have examined factors that influence recruitment and retention of health-care providers in rural communities; those factors include exposure to rural communities during medical training, origin in a rural setting, and various financial, professional, and lifestyle issues.[5],[6],[7],[8],[9] In addition, medical schools whose admission policies favor applicants with rural backgrounds and an interest in primary care have been shown to produce successful rural medical education programs.[10],[11],[12],[13],[14] In a systematic review, Rabinowitz et al.[15] identified that specially designed rural medical education programs were effective and went so far as to suggest that expansion of similar programs in medical schools would meet the need for more rural primary care providers. In 2011, others confirmed this approach in a nonrandomized intervention study with multiple control groups.[16] More recently, in 2015, researchers identified that training medical students in a rural county not only correlates directly with an increase in the number of primary care physicians (P < 0.001), but is also correlated with improved population health in the respective county (P = 0.005).[17]

Rooted in primary care, Physician Assistants (PAs) are among the health-care professionals well positioned to provide primary care to individuals in rural areas.[18],[19] Delivery of a PA program through distance education into rural areas is one avenue for addressing these health-care disparities. Expansion of a PA program to a distant site requires the assessment of available resources for program delivery, as well as approval from external organizations such as accreditation agencies. According to the 2020 5th Edition Standards of the Accreditation Review Commission on Education for the PA (ARC-PA), expansion to a distant campus requires that the program ensure "educational equivalency of course content, student experience, and access to didactic and laboratory materials."[20] A review of distant programs identified different approaches to the delivery of distance education, such as videoconferencing, audio and video recordings, interactive applications, video animations, interactive online discussion forums, and teleconferencing.[21],[22] With the advent of accessible internet, teleconferencing has emerged as one of the most common methods of delivering education to distant campuses. Ricci et al.[23] reported that 79% of individuals found teleconferencing at least as effective as face-to-face lectures. Gray et al.[24] also reported that telepresence (a form of teleconferencing) was beneficial to learning and teaching and superior to other systems participants had used; the authors further reported that the audiovisual quality, resulting intimacy, convenience, and ease of use helped facilitate teaching and learning. Bauer et al.[25] attribute these positive attributes of telepresence to the high-fidelity audio and video, which allow for an immersive collaborative experience between the two locations.
Based on the above evidence, the program determined that telepresence technology would meet the criteria for successful PA program expansion to a rural site, as well as the requirements of the ARC-PA, the PA accrediting agency.

An assessment process was established for the implementation of the distant campus that followed an intentional and reflective process to ensure compliance with the accreditation standards set by the accreditation body, ARC-PA. This process included the four components of the assessment cycle: (1) establish goals/outcomes, (2) develop and implement assessment strategies, (3) collect assessment data, and (4) utilize data for improvement. Student, faculty, and preceptor evaluations together with student outcomes were utilized. However, the establishment of the distant campus required additional assessment to ensure that the expansion of the program was providing equivalent education to students at both campuses.

A review of the literature revealed no publicly available evaluation tool for assessing the implementation of a distant campus. Therefore, an evaluation tool was developed based on evidence-based literature identifying four potential areas of concern when delivering education at a distant campus: technology equipment, technology reliability, faculty utilization of technology, and resources available at each site. This article describes the program's implementation of the distant campus utilizing the evaluation tool and the tool's value in assessing educational delivery via telepresence technology.


  Methods


Evaluation tool development

A systematic review of the literature identified four major areas of potential concern when utilizing teleconferencing to deliver curricular content: (1) items pertaining to the technology equipment of the delivery system, (2) items pertaining to the reliability of the delivery system, (3) items pertaining to faculty utilization of the technology to deliver curricular content, and (4) items pertaining to the resources at each campus. Within the four areas, sub-components were identified [Table 1]. These items were placed within an evaluation tool, and an appropriate Likert-type scale was determined for each area [Table 1] (available on request). The tool was administered through SurveyMonkey twice a semester during the Fall and Winter semesters and once during the Summer semester to students at both campuses (48 total students) across the didactic phase of the curriculum for the inaugural class. After each administration, the results were downloaded and analyzed for equivalency. In addition, the internal consistency of the survey and its four subscales was estimated using Cronbach's alpha.
Table 1: Items of concern identified when delivering curriculum to a distant campus



Analysis of data

Data are presented as the percentage of responses in the combined positive categories of “Very Good and Good,” “Very High Confidence and High Confidence,” and “Always and Often” for the entire aggregate (combining student responses at both campus sites) and for each individual campus. In addition, the average positive responses within each of the four areas were calculated to present a composite within each of the four areas. Percent change and trends were analyzed. No statistical corrections such as weighting were used.
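The tabulation just described can be sketched in Python; the category set matches the "Very Good/Good" scale, but the response lists below are hypothetical, not the study's data:

```python
from collections import Counter

POSITIVE = {"Very Good", "Good"}  # analogous sets apply to the other scales

def pct_positive(responses):
    """Percentage of responses falling in the combined positive categories."""
    counts = Counter(responses)
    return 100.0 * sum(counts[c] for c in POSITIVE) / len(responses)

# Hypothetical responses to one statement, split by campus
campus1 = ["Very Good"] * 20 + ["Good"] * 10 + ["Fair"] * 6   # n = 36
campus2 = ["Very Good"] * 4 + ["Good"] * 4 + ["Fair"] * 4     # n = 12
aggregate = campus1 + campus2                                 # n = 48

print(round(pct_positive(campus1), 1))    # 83.3
print(round(pct_positive(campus2), 1))    # 66.7
print(round(pct_positive(aggregate), 1))  # 79.2
```

Note that in this toy example the aggregate masks a below-70% result at one campus, which is why the article reports each campus independently as well.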

Establishment of benchmarks for the evaluation tool

The following benchmarks for the responses to the survey were established:

  1. 80% or greater of the aggregate (both campuses combined) and independent location responses should be in the top two positive categories of each set of statements: "Very Good" and "Good;" "Very High Confidence" and "High Confidence;" or "Always" and "Often." A percentage of 80% or greater in positive responses requires no action.
  2. Responses between 70% and 79% in the top two positive categories for the total aggregate and independent locations need to be addressed; however, action may or may not be required.
  3. Responses below 70% in the top two positive categories for the total aggregate and independent locations require analysis and an action plan for improvement.
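The three benchmark tiers above can be expressed as a small decision function (a sketch; the tier labels are paraphrased from the text):

```python
def benchmark_action(pct_positive: float) -> str:
    """Map a positive-response percentage onto the program's action tiers."""
    if pct_positive >= 80:
        return "no action"
    if pct_positive >= 70:
        return "address; action optional"
    return "analysis and action plan required"

print(benchmark_action(85))  # no action
print(benchmark_action(75))  # address; action optional
print(benchmark_action(58))  # analysis and action plan required
```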



  Results


Internal consistency of the newly developed survey

The Cronbach alpha values indicate that the survey has strong internal consistency. Using the individual student as the unit of analysis, the Cronbach alpha coefficients for the entire survey and each of the four subscales ranged between 0.77 and 0.92.

The results are presented based on the four areas evaluated. For each of the four areas, data are shown for the aggregate (student responses from both campus sites) and for the independent campuses, labeled Campus #1 (home campus, n = 36) and Campus #2 (distant campus, n = 12). Data are presented as the percentage of students responding favorably to the assessment statement and analyzed according to the established benchmarks.

Evaluation of technology equipment for the delivery system

[Table 2] identifies the percentage of responses within the positive categories; the benchmark was established at 80% or greater of responses in the positive categories ("Very Good" and "Good"). As evident in the table, the majority of responses met the benchmark of 80% or greater. The tool identified that the projection of the lecturing faculty did not meet the benchmark in the Fall end-of-semester survey for the aggregate (70%), Campus #1 (75%), and Campus #2 (58%). Qualitative comments identified that the projection of the faculty was too small. Based on these results, the system was updated with a larger screen and projector, which appeared to resolve the problem, as subsequent data met the benchmark. Another statement not meeting the benchmark was "The quality of the document camera projection (i.e., Elmo)." Qualitative statements identified that the projection at the distant campus was too bright; in response, faculty were instructed to adjust the brightness of the document camera.
Table 2: Percentage of responses in the “very good” and “good” categories for assessing the technology



Evaluation of the reliability of the delivery system

The second area of evaluation was confidence in the delivery system. As evident in [Table 3], this area met the benchmark of 80% or greater positive responses at nearly all times of evaluation. There were two time points when Campus #1 (home campus) was below the 80% benchmark (79% at fall mid-semester and 77% at summer end-of-semester). Qualitative comments suggested that faculty at Campus #1 were less knowledgeable about troubleshooting the technology than faculty at Campus #2. This finding resulted in immediate education of the faculty on the use of the system.
Table 3: Percentage of responses in “very high confidence” and “high confidence” categories for confidence in the technology



Evaluation of faculty utilization of the technology

The third area of evaluation pertained to the ability of faculty to utilize the delivery system. [Table 4] shows that the statements met the benchmark of 80% or greater positive responses at nearly all times of evaluation and for nearly all statements. Two statements within one semester (fall end-of-semester) received <80% positive responses. Based on qualitative comments, these two statements reflected the perception that some faculty from the main campus (Campus #1) were not as interactive with the students at the distant campus (Campus #2).
Table 4: Percentage of responses in “always” and “often” categories regarding ability of faculty to deliver content



Evaluation of resources at both campuses

The fourth area of evaluation pertained to resources at each campus. [Table 5] identifies multiple statements that did not meet the program benchmark of 80% or greater positive responses, with each campus having different statements not meeting the benchmark. During the fall semester of the program, students at the distant campus (Campus #2) identified that the program did not meet the benchmark for the statement "Your ability to access a library or other necessary resources………, when, and if needed." Responses to this statement improved in subsequent semesters after issues regarding accessibility of library resources were addressed [Table 5]. Student responses at the home site (Campus #1) did not meet the benchmark of 80% or greater positive responses for the statement "Ability to access instructional technology equipment: computers, printers, scanners, etc." Qualitative statements identified that access to printers was the reason for the negative evaluation in this area, and additional printers were subsequently made available. This example highlights the need to assess the statements independently for both campuses, as each campus had different areas not meeting the benchmark.
Table 5: Percentage of responses in “always” and “often” categories for statements regarding resources at each campus




  Discussion


Utilizing technology to deliver an entire didactic curriculum simultaneously to a distant campus is not without unforeseen consequences. However, a rigorous assessment strategy can enable timely intervention before consequences escalate to the point of interfering with student learning. This article describes the development of an evaluation tool that facilitated assessment, allowing for prompt recognition of concerns and subsequent modifications.

In the development of the evaluation tool, a systematic review identified four themes of potential concern. The first area, technology equipment used in the delivery system, was further stratified into the following sub-categories: screen projection of both the faculty and the lecture content (PowerPoint), audio transmission between the two sites, auxiliary equipment (document camera), and faculty ability to troubleshoot the equipment if needed. When assessing each site independently, there were site-specific concerns as well as concerns common to both locations. Early during program implementation, both sites responded that the screen projecting the faculty was too small; this was resolved, as evidenced in subsequent surveys, after a larger screen was installed. Specific to the distant campus (Campus #2) were concerns over the document camera, the screen size of the presentation materials (PowerPoint), and the quality of the videos. These too were resolved after the screen size for the presentation materials was enlarged and faculty were instructed on adjusting the lighting for the document camera. Student comments regarding the limited size of the projection of faculty and presentation materials (PowerPoint) are not unique. Gray et al.[24] identified that one barrier to their use of telepresence in delivering medical education was the size of the displayed image. Despite this issue, Gray et al.[24] found that students involved in telepresence education gave high satisfaction scores for visual quality: a score of 5.46 on a Likert scale of 1–6, with 6 representing "completely satisfied." Although we did not find any concerns with the audio component of the transmission, others have documented problems using older technology such as ITV, as well as teleconferencing. Gray et al.[24] reported that they observed some minor audio feedback and distortion in up to one-third of their transmissions between campuses.
A major factor in our not having audio issues may be the installation of quality microphones and speakers, which are essential components of telepresence.[26] Earlier studies involving distance education also identified audio responses at distant sites as being delayed or decreased and attributed the situation to the technology, as well as to faculty inability to adjust for feedback or background noise.[27]

Confidence in the technology was the second area in the evaluation tool. When analyzed as an aggregate, the program met its benchmark in positive responses. However, when analyzing site-specific results, there were two time points when the home campus did not meet the benchmark. Qualitative responses identified that confidence in the system was linked to faculty knowledge of the technology. Even with no interruption of the technology during the academic year, students at the home campus perceived that, if there were an interruption, certain faculty might not be able to handle the situation. Our findings linking confidence in the technology with faculty knowledge are similar to those of MacIntosh,[28] who reported that learners have greater difficulty adjusting to technology in classes where educators are not completely familiar with the technology.

Faculty ability to deliver curricular content utilizing the technology was the third area of potential concern. This area addressed the following topics: clarity of assignments, faculty ability to interact with students at both sites, and faculty response to students' questions in a timely manner. Analyzed as an aggregate, this area met the benchmark at every time point. When analyzing each site independently, there were subcategories that did not meet the benchmark at the distant campus: the ability of students to feel equally engaged and equally able to participate. Qualitative comments identified that faculty at the home campus did not engage students at the distant site. Kennedy et al.[27] describe that this type of response can be observed when distant students feel that instructors do not pay attention to them, and suggest that instructors should acknowledge students at the distant site by verbally recognizing who is in the distant classroom; when this does not occur, the distant-site students feel like second-class citizens. Based on the responses obtained from the evaluation tool, the program instituted formal in-services to reinforce ways to keep students at both campuses engaged. These techniques included interactive activities in the classroom, frequent check-in time points for faculty to acknowledge students at the distant campus, and travel to the distant campus to meet with students. As evident in subsequent surveys, these instructional reviews improved performance in this area.

The last area of potential concern pertained to resources at each site. The resources evaluated included the following: ability to access a library or other necessary resources, accessibility of support staff, promptness with which class materials were delivered/sent, ability to access necessary laboratory equipment, availability of and/or access to student services, and ability to access instructional technology equipment. Analysis identified that the distant campus initially had concerns regarding library resources and access to disability services; these concerns improved after students were provided instruction by library personnel and disability services. The home campus identified a lack of printers. Although additional printers were added at the home campus, students continued to respond that there were not enough; it should be noted that students at the home campus must compete with students from other health programs for access to printing, while students at the distant campus do not. In addition, students at the home campus did not meet the benchmark for accessibility to laboratory equipment, which was resolved when they were granted keycard access during open building hours. Students at the distant campus already had access to the laboratory space during open building hours, and therefore access to laboratory space was not identified as a concern.

Implementation of a distant campus utilizing a newly developed tool is not without limitations. With any new tool, there is always potential for adaptations based on initial analysis, as well as on changes in delivery and technology. Despite a lack of issues with classroom connectivity, a modification of the tool was made to address issues arising from a connection problem between the two campuses [Table 1]. A potential limitation of the study, not analyzed here, is the characteristics of the population at each site: students at the distant campus were selected based on personal goals to contribute to rural health, while the home campus did not have this requirement. Other limitations include those common to administering survey tools, such as students not answering all statements due to survey burnout, as well as personal experiences that may influence a student's perception and create bias in the data. Future studies involve gathering data from subsequent years to determine differences between the inaugural class and subsequent classes while utilizing the tool for equivalency.


  Conclusions


Universities are constantly attempting to expand their markets, resulting in a greater dependence on technology. Like all technology, there are strengths and weaknesses both in the equipment and in the human component of using it, and this study exemplified both. By planning an evaluation and assessment strategy during the implementation and ongoing phases of a distant campus, we were able to identify concerns with the technology (i.e., screen size), with faculty delivery (i.e., faculty engagement with students at both sites), and with resources (i.e., availability of printers, access to laboratory equipment, and library resources). These concerns were quickly addressed, resulting in the resolution of most issues. All students from each campus have since graduated, and all have successfully passed their certification exam, underscoring the successful implementation of the distant campus.

Financial support and sponsorship

This work was financially supported by the United States Health Resources and Services Administration under Grant #207025.

Conflicts of interest

There are no conflicts of interest.



 
  References

1. Rural Health Information Hub. Social Determinants of Health. Available from: https://www.ruralhealthinfo.org. [Last accessed on 2020 Mar 10].
2. O'Connor A, Wellenius G. Rural-urban disparities in the prevalence of diabetes and coronary heart disease. Public Health 2012;126:813-20.
3. Newkirk V, Damico A. The Affordable Care Act and Insurance Coverage in Rural Areas. Kaiser Commission on Medicaid and the Uninsured; 29 May, 2014. Available from: https://www.kff.org/uninsured/issue-brief/the-affordable-care-act-and-insurance-coverage-in-rural-areas/. [Last accessed on 2020 Mar 20].
4. Federal Communications Commission Process Reform Act of 2015. Congressional Report. U.S. Congress. Senate Committee on Commerce, Science, and Transportation. Available from: https://congressional.proquest.com/congressional/docview/t05.d06.2016-s263-40?accountid=39473. [Last accessed on 2020 Mar 10].
5. Rabinowitz HK. Recruitment, retention, and follow-up of graduates of a program to increase the number of family physicians in rural and underserved areas. N Engl J Med 1993;328:934-9.
6. Rabinowitz HK, Diamond JJ, Hojat M, Hazelwood CE. Demographic, educational and economic factors related to recruitment and retention of physicians in rural Pennsylvania. J Rural Health 1999;15:212-8.
7. Rourke JT, Rourke LL. Rural family medicine training in Canada. Can Fam Physician 1995;41:993-1000.
8. Thommasen HV, Thommasen AT. General practitioner-to-population ratios and long-term family physician retention in British Columbia's health regions. Can J Rural Med 2001;6:115-22.
9. Azer SA, Simmons D, Elliott SL. Rural training and the state of rural health services: Effect of rural background on the perception and attitude of first-year medical students at the University of Melbourne. Aust J Rural Health 2001;9:178-85.
10. Ranmuthugala G, Humphreys JS, Solarsh B, Walters L, Worley P, Wakerman J, et al. Where is the evidence that rural exposure increases uptake of rural medical practice? Aust J Rural Health 2007;15:285-8.
11. Rabinowitz HK, Diamond JJ, Markham FW, Wortman JR. Medical school programs to increase the rural physician supply: A systematic review and projected impact of widespread replication. Acad Med 2008;83:235-43.
12. Geyman JP, Hart LG, Norris TE, Coombs JB, Lishner DM. Educating generalist physicians for rural practice: How are we doing? J Rural Health 2000;16:56-80.
13. Curran V, Rourke J. The role of medical education in the recruitment and retention of rural physicians. Med Teach 2004;26:265-72.
14. Henry JA, Edwards BJ, Crotty B. Why do medical graduates choose rural careers? Rural Remote Health 2009;9:1083.
15. Rabinowitz HK, Petterson S, Boulger JG, Hunsaker ML, Markham FW, Diamond JJ, et al. Comprehensive medical school rural programs produce rural family physicians. Am Fam Physician 2011;84:1350.
16. Wheat JR, Leeper JD, Brandon JE, Guin SM, Jackson JR. The rural medical scholars program study: Data to inform rural health policy. J Am Board Fam Med 2011;24:93-101.
17. Wheat JR, Coleman VL, Murphy S, Turberville CM, Leeper JD. Medical education to improve rural population health: A chain of evidence from Alabama. J Rural Health 2015;31:354-64.
18. Yuen CX, Lessard D. Filling the gaps: Predicting physician assistant students' interest in practicing in medically underserved areas. J Physician Assist Educ 2018;29:220-5.
19. Gruca TS, Nelson GC, Thiesen L, Asprey DP, Young SG. The workforce trends of physician assistants in Iowa (1995-2015). PLoS One 2018;13:e0204813.
20. ARC-PA. ARC-PA Standards, 5th ed. Available from: http://www.arc-pa.org/wp-content/uploads/2020/07/Standards-5th-Ed-Nov-2019.pdf. [Last accessed on 2020 Jul 27].
21. Carrière MF, Harvey D. Current state of distance continuing medical education in North America. J Contin Educ Health Prof 2001;21:150-7.
22. Shaw M. Proposing continuing medical education for the Pacific. Pac Health Dialog 2000;7:86-7.
23. Ricci MA, Caputo MP, Callas PW, Gagne M. The use of telemedicine for delivering continuing medical education in rural communities. Telemed J E Health 2005;11:124-9.
24. Gray K, Krogh K, Newsome D, Smith V, Lancaster D, Nestel D. TelePresence in rural medical education: A mixed methods evaluation. J Biomed Educ 2014:1-8.
25. Bauer J, Durakbasa N, Bas G, Guclu E, Kopacek P. Telepresence in education. IFAC-PapersOnLine 2015;48:178-82.
26. Hunter TS, Deziel-Evans L, Marsh WA. Assuring excellence in distance pharmaceutical education. Am J Pharm Educ 2003;67:519-43.
27. Kennedy DH, Ward CT, Metzner MC. Distance education: Using compressed interactive video technology for an entry-level doctor of pharmacy program. Am J Pharm Educ 2003;67:118.
28. MacIntosh J. Learner concerns and teaching strategies for video-conferencing. J Contin Educ Nurs 2001;32:260-5.



 
 
    Tables

  [Table 1], [Table 2], [Table 3], [Table 4], [Table 5]



 
